Sample records for increasingly common tools

  1. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMIT) have been identified as a way to effectively and economically present instructional material. MMITs are commonly used in introductory computer applications courses on the premise that they should be effective in increasing student knowledge and positively impacting motivation and learning strategies, without increasing costs. This…

  2. Online Tools to Support the Delivery of Evidence-Based Practices for Students with ASD

    ERIC Educational Resources Information Center

    Sam, Ann M.; Kucharczyk, Suzanne; Waters, Victoria

    2018-01-01

    Educators continually encounter new challenges that require different tools, or novel ways to utilize current ones. Common challenges when working with students with autism spectrum disorder (ASD) may include addressing interfering behavior, developing communication systems, increasing social opportunities for students, and addressing…

  3. On the evolutionary and ontogenetic origins of tool-oriented behaviour in New Caledonian crows (Corvus moneduloides)

    PubMed Central

    Kenward, Ben; Schloegl, Christian; Rutz, Christian; Weir, Alexander A. S.; Bugnyar, Thomas; Kacelnik, Alex

    2015-01-01

    New Caledonian crows (Corvus moneduloides) are prolific tool users in captivity and in the wild, and have an inherited predisposition to express tool-oriented behaviours. To further understand the evolution and development of tool use, we compared the development of object manipulation in New Caledonian crows and common ravens (Corvus corax), which do not routinely use tools. We found striking qualitative similarities in the ontogeny of tool-oriented behaviour in New Caledonian crows and food-caching behaviour in ravens. Given that the common ancestor of New Caledonian crows and ravens was almost certainly a caching species, we therefore propose that the basic action patterns for tool use in New Caledonian crows may have their evolutionary origins in caching behaviour. Noncombinatorial object manipulations had similar frequencies in the two species. However, frequencies of object combinations that are precursors to functional behaviour increased in New Caledonian crows and decreased in ravens throughout the study period, ending 6 weeks post-fledging. These quantitative observations are consistent with the hypothesis that New Caledonian crows develop tool-oriented behaviour because of an increased motivation to perform object combinations that facilitate the necessary learning. PMID:25892825

  4. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    ERIC Educational Resources Information Center

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  5. A review of cultural adaptations of screening tools for autism spectrum disorders.

    PubMed

    Soto, Sandra; Linas, Keri; Jacobstein, Diane; Biel, Matthew; Migdal, Talia; Anthony, Bruno J

    2015-08-01

    Screening children to determine risk for Autism Spectrum Disorders has become more common, although some question the advisability of such a strategy. The purpose of this systematic review is to identify autism screening tools that have been adapted for use in cultures different from that in which they were developed, evaluate the cultural adaptation process, report on the psychometric properties of the adapted instruments, and describe the implications for further research and clinical practice. A total of 21 articles met criteria for inclusion, reporting on the cultural adaptation of autism screening in 19 countries and in 10 languages. The cultural adaptation process was not always clearly outlined and often did not include the recommended guidelines. Cultural/linguistic modifications to the translated tools tended to increase with the rigor of the adaptation process. Differences between the psychometric properties of the original and adapted versions were common, indicating the need to obtain normative data on populations to increase the utility of the translated tool. © The Author(s) 2014.

  6. Multimedia Instructional Tools and Student Learning in a Computer Applications Course

    ERIC Educational Resources Information Center

    Chapman, Debra L.; Wang, Shuyan

    2015-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology in the classroom. Multimedia instructional tools (MMIT) provide student-centered active-learning instructional activities. MMITs are common in introductory computer applications courses based on the premise that MMITs should increase student…

  7. Efficacy of Handheld Electronic Visual Supports to Enhance Vocabulary in Children with ASD

    ERIC Educational Resources Information Center

    Ganz, Jennifer B.; Boles, Margot B.; Goodwyn, Fara D.; Flores, Margaret M.

    2014-01-01

    Although electronic tools such as handheld computers have become increasingly common throughout society, implementation of such tools to improve skills in individuals with intellectual and developmental disabilities has lagged in the professional literature. However, the use of visual scripts for individuals with disabilities, particularly those…

  8. Problem Solving and Training Guide for Shipyard Industrial Engineers

    DTIC Science & Technology

    1986-06-01

    ...called upon to increase the knowledge about industrial engineering of some shipyard group. The Curriculum is seen especially as a tool to identify new...materials on all common machine shop tools. Data permits calculation of machining time. 085 Ostwald, Phillip F., AMERICAN MACHINIST MANUFACTURING COST...

  9. Use of performance measurement to include air quality and energy into mileage-based user fees.

    DOT National Transportation Integrated Search

    2012-03-01

    Road pricing is an increasingly popular tool for achieving a number of transportation policy related goals and objectives. Addressing environmental concerns is a common goal of road pricing systems in Europe but is less common in the U.S., and framew...

  10. High speed metal removal

    NASA Astrophysics Data System (ADS)

    Pugh, R. F.; Pohl, R. F.

    1982-10-01

    Four types of steel (AISI 1340, 4140, 4340, and HF-1) which are commonly used in large caliber projectile manufacture were machined at different hardness ranges representing the as-forged and the heat treated condition with various ceramic tools using ceramic coated tungsten carbide as a reference. Results show that machining speeds can be increased significantly using present available tooling.

  11. Ultra-high surface speed for metal removal, artillery shell

    NASA Astrophysics Data System (ADS)

    Pugh, R. F.; Walsh, M. R.; Pohl, R. F.

    1981-07-01

    Four types of steel (AISI 1340, 4140, 4340, and HF-1) which are commonly used in large caliber projectile manufacture were machined with five types of tools at different hardness ranges representing the as-forged and the heat-treated condition. Results show that machining speeds can be increased significantly over current practice using the present available tooling.

  12. Allometric equations for urban ash trees (Fraxinus spp.) in Oakville, Southern Ontario, Canada

    Treesearch

    Paula J. Peper; Claudia P. Alzate; John W. McNeil; Jalil Hashemi

    2014-01-01

    Tree growth equations are an important and common tool used to effectively assess the yield and determine management practices in forest plantations. Increasingly, they are being developed for urban forests, providing tools to assist urban forest managers with species selection, placement, and estimation of management costs and ecosystem services. This study describes...

  13. The database management system: A topic and a tool

    NASA Technical Reports Server (NTRS)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  14. Concept relationship editor: a visual interface to support the assertion of synonymy relationships between taxonomic classifications

    NASA Astrophysics Data System (ADS)

    Craig, Paul; Kennedy, Jessie

    2008-01-01

    An increasingly common approach being taken by taxonomists to define the relationships between taxa in alternative hierarchical classifications is to use a set-based notation which states the relationship between two taxa from alternative classifications. Textual recording of these relationships is cumbersome and difficult for taxonomists to manage. While text-based GUI tools are beginning to appear which ease the process, these have several limitations. Interactive visual tools offer greater potential to allow taxonomists to explore the taxa in these hierarchies and specify such relationships. This paper describes the Concept Relationship Editor, an interactive visualisation tool designed to support the assertion of relationships between taxonomic classifications. The tool operates using an interactive space-filling adjacency layout which allows users to expand multiple lists of taxa with common parents so they can explore and assert relationships between two classifications.

  15. Industrial energy systems and assessment opportunities

    NASA Astrophysics Data System (ADS)

    Barringer, Frank Leonard, III

    Industrial energy assessments are performed primarily to increase energy system efficiency and reduce energy costs in industrial facilities. The most common energy systems are lighting, compressed air, steam, process heating, HVAC, pumping, and fan systems, and these systems are described in this document. ASME has produced energy assessment standards for four energy systems, and these systems include compressed air, steam, process heating, and pumping systems. ASHRAE has produced an energy assessment standard for HVAC systems. Software tools for energy systems were developed for the DOE, and there are software tools for almost all of the most common energy systems. The software tools are AIRMaster+ and LogTool for compressed air systems, SSAT and 3E Plus for steam systems, PHAST and 3E Plus for process heating systems, eQUEST for HVAC systems, PSAT for pumping systems, and FSAT for fan systems. The recommended assessment procedures described in this thesis are used to set up an energy assessment for an industrial facility, collect energy system data, and analyze the energy system data. The assessment recommendations (ARs) are opportunities to increase efficiency and reduce energy consumption for energy systems. A set of recommended assessment procedures and recommended assessment opportunities are presented for each of the most common energy systems. There are many assessment opportunities for industrial facilities, and this thesis describes forty-three ARs for the seven different energy systems. There are seven ARs for lighting systems, ten ARs for compressed air systems, eight ARs for boiler and steam systems, four ARs for process heating systems, six ARs for HVAC systems, and four ARs for both pumping and fan systems. Based on a history of past assessments, average potential energy savings and typical implementation costs are shared in this thesis for most ARs. 
Implementing these ARs will increase efficiency and reduce energy consumption for energy systems in industrial facilities. This thesis does not explain all energy saving ARs that are available, but does describe the most common ARs.

  16. Tool compounds robustly increase turnover of an artificial substrate by glucocerebrosidase in human brain lysates.

    PubMed

    Berger, Zdenek; Perkins, Sarah; Ambroise, Claude; Oborski, Christine; Calabrese, Matthew; Noell, Stephen; Riddell, David; Hirst, Warren D

    2015-01-01

    Mutations in glucocerebrosidase (GBA1) cause Gaucher disease and also represent a common risk factor for Parkinson's disease and Dementia with Lewy bodies. Recently, new tool molecules were described which can increase turnover of an artificial substrate 4MUG when incubated with mutant N370S GBA1 from human spleen. Here we show that these compounds exert a similar effect on the wild-type enzyme in a cell-free system. In addition, these tool compounds robustly increase turnover of 4MUG by GBA1 derived from human cortex, despite substantially lower glycosylation of GBA1 in human brain, suggesting that the degree of glycosylation is not important for compound binding. Surprisingly, these tool compounds failed to robustly alter GBA1 turnover of 4MUG in the mouse brain homogenate. Our data raise the possibility that in vivo models with humanized glucocerebrosidase may be needed for efficacy assessments of such small molecules.

  17. Small scale sequence automation pays big dividends

    NASA Technical Reports Server (NTRS)

    Nelson, Bill

    1994-01-01

    Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process, with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC-based sequence review tools evolved. Each tool performs a specific task, but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.

  18. Human eye haptics-based multimedia.

    PubMed

    Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron

    2014-01-01

    Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue or part being explored. Haptics increases the sense of interaction with virtual objects, improving user experience in a more realistic manner. Common tools for studying the eye are books, illustrations, and assembly models, and more recently these are being complemented with mobile apps whose 3D capabilities, computing power and user bases are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye from up close and within, with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions on improving haptic feedback and navigation.

  19. Importance and pitfalls of molecular analysis to parasite epidemiology.

    PubMed

    Constantine, Clare C

    2003-08-01

    Molecular tools are increasingly being used to address questions about parasite epidemiology. Parasites represent a diverse group and they might not fit traditional population genetic models. Testing hypotheses depends equally on correct sampling, appropriate tool and/or marker choice, appropriate analysis and careful interpretation. All methods of analysis make assumptions which, if violated, make the results invalid. Some guidelines to avoid common pitfalls are offered here.

  20. Development of a downed woody debris forecasting tool using strategic-scale multiresource forest inventories

    Treesearch

    Matthew B. Russell; Christopher W. Woodall

    2017-01-01

    The increasing interest in forest biomass for energy or carbon cycle purposes has raised the need for forest resource managers to refine their understanding of downed woody debris (DWD) dynamics. We developed a DWD forecasting tool using field measurements (mean size and stage of decay) for three common forest types across the eastern United States using field...

  1. Scalability Assessments for the Malicious Activity Simulation Tool (MAST)

    DTIC Science & Technology

    2012-09-01

    ...the scalability characteristics of MAST. Specifically, we show that an exponential increase in clients using the MAST software does not impact network and system resources significantly. Additionally, we...

  2. The Utilization of Network Enabled Capability in NATO Air C2 and Targeting Systems

    DTIC Science & Technology

    2013-06-01

    ...JCOP aims to provide a common operational picture to the NATO users to increase the situational awareness by...transformation steps (Figure 9: WISI consumed by Oracle's SOA tools). The aim of this experiment was to...(TBMCS) using a common methodology. For this purpose, initially, a common mission definition (CMD) was defined which had the same meaning for all of...

  3. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  4. New Common Proper-Motion Pairs with R.A. Between 00h and 01h

    NASA Astrophysics Data System (ADS)

    Caballero, Rafael

    2015-07-01

    This paper presents 37 new common proper-motion pairs. The new pairs have been obtained employing a semi-automatic procedure based on the inspection of images using the tool Aladin, completed with information obtained from the catalogs available at VizieR. All the pairs fulfill the Halbwachs criteria, employed to increase the probability of a physical bond between the two components.

  5. Tools for Teaching Virtual Teams: A Comparative Resource Review

    ERIC Educational Resources Information Center

    Larson, Barbara; Leung, Opal; Mullane, Kenneth

    2017-01-01

    As the ubiquity of virtual work--and particularly virtual project teams--increases in the professional environment, management and other professional programs are increasingly teaching students skills related to virtual work. One of the most common forms of teaching virtual work skills is a virtual team project, in which students collaborate with…

  6. Childhood CT scans linked to leukemia and brain cancer later in life

    Cancer.gov

    Children and young adults scanned multiple times by computed tomography (CT), a commonly used diagnostic tool, have a small increased risk of leukemia and brain tumors in the decade following their first scan.

  7. Common Moles, Atypical Moles (Dysplastic Nevi), and Risk of Melanoma

    MedlinePlus

    ...freckles have an increased chance of melanoma. Certain medical conditions or medicines: Medical conditions or medicines (such...

  8. An investigation of chatter and tool wear when machining titanium

    NASA Technical Reports Server (NTRS)

    Sutherland, I. A.

    1974-01-01

    The low thermal conductivity of titanium, together with the low contact area between chip and tool and the unusually high chip velocities, gives rise to high tool tip temperatures and accelerated tool wear. Machining speeds have to be considerably reduced to avoid these high temperatures with a consequential loss of productivity. Restoring this lost productivity involves increasing other machining variables, such as feed and depth-of-cut, and can lead to another machining problem commonly known as chatter. This work is to acquaint users with these problems, to examine the variables that may be encountered when machining a material like titanium, and to advise the machine tool user on how to maximize the output from the machines and tooling available to him. Recommendations are made on ways of improving tolerances, reducing machine tool instability or chatter, and improving productivity. New tool materials, tool coatings, and coolants are reviewed and their relevance examined when machining titanium.

  9. Providing the Tools for Information Sharing: Net-Centric Enterprise Services

    DTIC Science & Technology

    2007-07-01

    The Department of Defense (DoD) is establishing a net-centric environment that increasingly leverages shared services and Service-Oriented...transformational program that delivers a set of shared services as part of the DoD’s common infrastructure to enable networked joint force capabilities, improved interoperability, and increased information sharing across mission area services.

  10. Agricultural Geophysics: Past, present, and future

    USDA-ARS?s Scientific Manuscript database

    Geophysical methods are becoming an increasingly valuable tool for agricultural applications. Agricultural geophysics investigations are commonly (although certainly not always) focused on delineating small- and/or large-scale objects/features within the soil profile (~ 0 to 2 m depth) over very lar...

  11. Diagnostic tools in ocular allergy.

    PubMed

    Leonardi, A; Doan, S; Fauquert, J L; Bozkurt, B; Allegri, P; Marmouz, F; Rondon, C; Jedrzejczak, M; Hellings, P; Delgado, L; Calder, V

    2017-10-01

    Ocular allergy (OA) includes a group of common and less frequent hypersensitivity disorders frequently misdiagnosed and not properly managed. The diagnosis of OA is usually based on clinical history and signs and symptoms, with the support of in vivo and in vitro tests when identification of the specific allergen is required. To date, no specific test is available for the diagnosis of the whole spectrum of the different forms of OA. The lack of recommendations on diagnosis of OA is considered a medical need not only for allergists but also for ophthalmologists. This position paper aims to provide a comprehensive overview of the currently available tools for diagnosing OA to promote a common nomenclature and procedures to be used by different specialists. Questionnaires, sign and symptom grading scales, tests, and potential biomarkers for OA are reviewed. We also identified several unmet needs in the diagnostic tools to generate interest, increase understanding, and inspire further investigations. Tools, recommendations, and algorithms for the diagnosis of OA are proposed for use by both allergists and ophthalmologists. Several unmet needs in the diagnostic tools should be further improved by specific clinical research in OA. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.

  12. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failure of parallel subsystems are correlated.

  13. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    PubMed

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  15. Application of geophysical methods to agriculture: An overview

    USDA-ARS?s Scientific Manuscript database

    Geophysical methods are becoming an increasingly valuable tool for agricultural applications. Agricultural geophysics investigations are commonly (although certainly not always) focused on delineating small- and/or large-scale objects/features within the soil profile (~ 0 to 2 m depth) over very lar...

  16. Successful Clicker Standardization

    ERIC Educational Resources Information Center

    Twetten, Jim; Smith, M. K.; Julius, Jim; Murphy-Boyer, Linda

    2007-01-01

    Student response systems, commonly referred to as "clickers," have become an important learning tool in higher education. With a growing number of faculty using the technology to promote active learning, student engagement, and assessment, most campuses have seen increasing clicker use. And with faculty bombarded by multiple,…

  17. Differential Tuning of Ventral and Dorsal Streams during the Generation of Common and Uncommon Tool Uses.

    PubMed

    Matheson, Heath E; Buxbaum, Laurel J; Thompson-Schill, Sharon L

    2017-11-01

    Our use of tools is situated in different contexts. Prior evidence suggests that diverse regions within the ventral and dorsal streams represent information supporting common tool use. However, given the flexibility of object concepts, these regions may be tuned to different types of information when generating novel or uncommon uses of tools. To investigate this, we collected fMRI data from participants who reported common or uncommon tool uses in response to visually presented familiar objects. We performed a pattern dissimilarity analysis in which we correlated cortical patterns with behavioral measures of visual, action, and category information. The results showed that evoked cortical patterns within the dorsal tool use network reflected action and visual information to a greater extent in the uncommon use group, whereas evoked neural patterns within the ventral tool use network reflected categorical information more strongly in the common use group. These results reveal the flexibility of cortical representations of tool use and the situated nature of cortical representations more generally.

  18. The Common Sense Guide to the Common Core: Teacher-Tested Tools for Implementation

    ERIC Educational Resources Information Center

    McKnight, Katherine

    2014-01-01

    Based on the original source document for the Common Core State Standards and tested by 1,000 educators in diverse classrooms across the country, these research-based tools will help readers examine their current practices and adapt existing curriculum. Each of the 40 tools is clearly presented, explained, and exemplified, guiding educators…

  19. Distributed Pedagogical Leadership and Generative Dialogue in Educational Nodes

    ERIC Educational Resources Information Center

    Jappinen, Aini-Kristiina; Sarja, Anneli

    2012-01-01

    The article presents practices of distributed pedagogical leadership and generative dialogue as a tool with which management and personnel can better operate in the increasingly turbulent world of education. Distributed pedagogical leadership includes common characteristics of a professional learning community when the educational actors…

  20. Common features of microRNA target prediction tools

    PubMed Central

    Peterson, Sarah M.; Thompson, Jeffrey A.; Ufkin, Melanie L.; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates

    2014-01-01

    The human genome encodes over 1,800 microRNAs (miRNAs), short non-coding RNA molecules that regulate gene expression post-transcriptionally. Because one miRNA can target multiple gene transcripts, miRNAs are recognized as a major mechanism regulating gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities: DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. Across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field, with increasing attention on the development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output. PMID:24600468
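    The first of the four common features, the seed match, is simple enough to sketch directly: positions 2-8 of the miRNA (the seed) must pair with the mRNA 3'UTR, so a predictor searches the UTR for the reverse complement of the seed. The sequences below are made up for illustration (the miRNA is merely let-7-like); real tools add conservation, free-energy, and accessibility scoring on top of this.

```python
# Hedged sketch of a perfect 7mer seed-match scan, the feature most target
# predictors start from. Sequences are illustrative, written 5'->3' in RNA.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_sites(mirna, utr, seed_start=1, seed_len=7):
    """Return 0-based UTR positions with a perfect seed match (miRNA nt 2-8)."""
    seed = mirna[seed_start:seed_start + seed_len]
    # The target site is the reverse complement of the seed:
    site = "".join(COMPLEMENT[b] for b in reversed(seed))
    return [i for i in range(len(utr) - len(site) + 1)
            if utr[i:i + len(site)] == site]

mirna = "UGAGGUAGUAGGUUGUAUAGUU"   # hypothetical, let-7-like
# seed (nt 2-8) = GAGGUAG, so the site searched for is CUACCUC
utr = "AAACUACCUCAAAGGCUACCUCAA"
sites = seed_sites(mirna, utr)     # two candidate sites in this toy UTR
```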

  1. Common features of microRNA target prediction tools.

    PubMed

    Peterson, Sarah M; Thompson, Jeffrey A; Ufkin, Melanie L; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates

    2014-01-01

    The human genome encodes over 1,800 microRNAs (miRNAs), short non-coding RNA molecules that regulate gene expression post-transcriptionally. Because one miRNA can target multiple gene transcripts, miRNAs are recognized as a major mechanism regulating gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities: DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. Across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field, with increasing attention on the development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.

  2. Evaluating a standardised clinical assessment tool for pre-registration midwifery students: A cross-sectional survey of midwifery students and midwives in Australia.

    PubMed

    Morrow, Jane; Biggs, Laura; Stelfox, Sara; Phillips, Diane; McKellar, Lois; McLachlan, Helen

    2016-02-01

    Assessment of clinical competence is a core component of midwifery education. Clinical assessment tools have been developed to help increase consistency and overcome subjectivity of assessment. The study had two main aims: first, to explore the views and experiences of midwifery students and educators/clinical midwives regarding a common clinical assessment tool used for all pre-registration midwifery programmes in Victoria and at the University of South Australia; and second, to assess the need for changes to the tool to align with developments in clinical practice and evidence-based care. A cross-sectional, web-based survey including Likert-type scales and open-ended questions was utilised. Participants were students enrolled in all four entry pathways to midwifery at seven Victorian universities and one South Australian university, together with educators/clinical midwives across both states. One hundred and ninety-one midwifery students and 86 educators/clinical midwives responded. Overall, students and educators/clinical midwives were positive about the Clinical Assessment Tool, with over 90% reporting that it covered the necessary midwifery skills. Students and educators/clinical midwives reported high levels of satisfaction with the content of the learning tools. Only 4% of educators/clinical midwives and 6% of students rated the Clinical Assessment Tool as poor overall. Changes to some learning tools were necessary in order to reflect recent practice and evidence. A common clinical assessment tool for evaluating midwifery students' clinical practice may facilitate the provision of consistent, reliable and objective assessment of student skills and competency. Copyright © 2015 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  3. Two NextGen Air Safety Tools: An ADS-B Equipped UAV and a Wake Turbulence Estimator

    NASA Astrophysics Data System (ADS)

    Handley, Ward A.

    Two air safety tools are developed in the context of the FAA's NextGen program. The first tool addresses the alarming increase in the frequency of near-collisions between manned and unmanned aircraft by equipping a common hobby-class UAV with an ADS-B transponder that broadcasts its position, speed, heading, and unique identification number to all local air traffic. The second tool estimates and outputs the location of dangerous wake vortex corridors in real time, based on ADS-B data collected and processed using a custom software package developed for this project. The TRansponder-based Position Information System (TRAPIS) consists of data packet decoders, an aircraft database, a graphical user interface (GUI), and the wake vortex extension application. Output from TRAPIS can be visualized in Google Earth, alleviating the problem of pilots being left to imagine where invisible wake vortex corridors are based solely on intuition or verbal warnings from ATC. The result of these two tools is increased situational awareness, and hence safety, for human pilots in the National Airspace System (NAS).
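    The geometric idea behind a wake corridor estimate can be sketched very simply: trailing vortices sink at a roughly constant rate and drift with the wind, so a track point broadcast some seconds ago maps to a displaced, lower corridor segment now. This is a toy kinematic sketch under invented numbers, not the TRAPIS algorithm.

```python
# Toy sketch (not TRAPIS): where has the wake from a past ADS-B track point
# moved? Assume a constant sink rate and uniform wind; all values invented.

def wake_position(alt_m, east_m, north_m, age_s,
                  sink_rate_ms=1.8, wind_e_ms=3.0, wind_n_ms=0.0):
    """Estimated (east, north, altitude) of a wake segment age_s seconds
    after the generating aircraft passed the given track point."""
    return (east_m + wind_e_ms * age_s,
            north_m + wind_n_ms * age_s,
            alt_m - sink_rate_ms * age_s)

# A track point reported via ADS-B 60 s ago, at 1000 m over the origin:
e, n, alt = wake_position(alt_m=1000.0, east_m=0.0, north_m=0.0, age_s=60.0)
# The corridor segment has drifted ~180 m east and sunk ~108 m.
```

    Stringing such segments along the whole track, one per past position report, gives the corridor polyline that a viewer like Google Earth can display.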

  4. War and Peace in International Relations Theory: A Classroom Simulation

    ERIC Educational Resources Information Center

    Sears, Nathan Alexander

    2018-01-01

    Simulations are increasingly common pedagogical tools in political science and international relations courses. This article develops a classroom simulation that aims to facilitate students' theoretical understanding of the topic of war and peace in international relations, and accomplishes this by incorporating important theoretical concepts…

  5. Enhancing Understanding of Transformation Matrices

    ERIC Educational Resources Information Center

    Dick, Jonathan; Childrey, Maria

    2012-01-01

    With the Common Core State Standards' emphasis on transformations, teachers need a variety of approaches to increase student understanding. Teaching matrix transformations by focusing on row vectors gives students tools to create matrices to perform transformations. This empowerment opens many doors: Students are able to create the matrices for…

  6. Improving Learning Experiences through Gamification: A Case Study

    ERIC Educational Resources Information Center

    Geelan, Benjamin; de Salas, Kristy; Lewis, Ian; King, Carolyn; Edwards, Dale; O'Mara, Aidan

    2015-01-01

    Gamified learning systems are becoming increasingly common within educational institutions, however there is a lack of understanding on the elements of gamification that influence, either positively or negatively, the learning experiences of students using these systems. This study examines an existing gamified learning tool implemented within an…

  7. Visualization and Analytics Tools for Infectious Disease Epidemiology: A Systematic Review

    PubMed Central

    Carroll, Lauren N.; Au, Alan P.; Detwiler, Landon Todd; Fu, Tsung-chieh; Painter, Ian S.; Abernethy, Neil F.

    2014-01-01

    Background A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) Identify public health user needs and preferences for infectious disease information visualization tools; (2) Identify existing infectious disease information visualization tools and characterize their architecture and features; (3) Identify commonalities among approaches applied to different data types; and (4) Describe tool usability evaluation efforts and barriers to the adoption of such tools. Methods We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. Results A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. 
Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. Discussion and Conclusion As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. PMID:24747356

  8. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    PubMed

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. 
Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Moving Frost Hardy Genes From Wild to Cultivated Potatoes. Use of Precise Screening Tools to Make Real Progress

    USDA-ARS?s Scientific Manuscript database

    The common cultivated species Solanum tuberosum is frost sensitive and is killed at temperatures below -2.5°C. It has been estimated that by increasing frost hardiness by 1–2°C one can expect an increase in potato yield of 26 to 40% in the Altiplano (Peru and Bolivia) covering 63,000 ha of potatoes....

  10. A comparative evaluation of genome assembly reconciliation tools.

    PubMed

    Alhakami, Hind; Mirebrahim, Hamid; Lonardi, Stefano

    2017-05-18

    The majority of eukaryotic genomes are unfinished due to the algorithmic challenges of assembling them. A variety of assembly and scaffolding tools are available, but it is not always obvious which tool or parameters to use for a specific genome size and complexity. It is, therefore, common practice to produce multiple assemblies using different assemblers and parameters, then select the best one for public release. A more compelling approach would allow one to merge multiple assemblies with the intent of producing a higher-quality consensus assembly, which is the objective of assembly reconciliation. Several assembly reconciliation tools have been proposed in the literature, but their strengths and weaknesses have never been compared on a common dataset. This work fills that need: we report an extensive comparative evaluation of several such tools. Specifically, we evaluate contiguity, correctness, coverage, and the duplication ratio of the merged assembly compared to the individual assemblies provided as input. None of the tools we tested consistently improved the quality of the input GAGE and synthetic assemblies. Our experiments show an increase in contiguity in the consensus assembly when the original assemblies already have high quality. In terms of correctness, the quality of the results depends on the specific tool, as well as on the quality and the ranking of the input assemblies. In general, the number of misassemblies ranges from comparable to the best input assembly to comparable to the worst.
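    Of the metrics evaluated, contiguity is conventionally summarized by N50: the largest contig length L such that contigs of length ≥ L cover at least half of the total assembly. A minimal sketch, with invented contig lengths standing in for a merged and an input assembly:

```python
# N50 sketch: the contig length at which the cumulative sum of contig
# lengths (largest first) first reaches half of the assembly total.

def n50(contig_lengths):
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0

# Hypothetical contig lengths (bp); both assemblies total 2,000 bp:
merged = [900, 500, 400, 150, 50]
single = [400, 400, 300, 300, 300, 300]
# Here the merged assembly is more contiguous: N50 of 500 vs. 300.
```

    Correctness (misassemblies) and duplication ratio need an alignment to a reference or to the inputs, so they cannot be computed from lengths alone.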

  11. Nursing Home Medication Reconciliation: A Quality Improvement Initiative.

    PubMed

    Tong, Monica; Oh, Hye Young; Thomas, Jennifer; Patel, Sheila; Hardesty, Jennifer L; Brandt, Nicole J

    2017-04-01

    The current quality improvement initiative evaluated the medication reconciliation process within select nursing homes in Washington, DC. Common types of medication discrepancies were identified through monthly retrospective chart reviews of newly admitted patients in two different nursing homes. The use of high-risk medications, namely antidiabetic, anticoagulant, and opioid agents, was also recorded. A standardized spreadsheet tool based on multiple medication reconciliation implementation tool kits was created to record the information. The five most common medication discrepancies were incorrect indication (21%), no monitoring parameters (17%), medication name omitted (11%), incorrect dose (10%), and incorrect frequency (8%). Antidiabetic agents were the most used high-risk medication at both sites. This initiative highlights that medication discrepancies on admission are common in nursing homes and may be clinically impactful. More attention needs to be given to workflow processes to improve medication reconciliation, considering the increased risk for adverse drug events and hospitalizations. [Journal of Gerontological Nursing and Mental Health Services, 43(4), 9-14.]. Copyright 2017, SLACK Incorporated.
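    Some of the discrepancy categories counted above (omitted medication, incorrect dose, incorrect frequency) amount to comparing the pre-admission list with admission orders. The sketch below shows that comparison; the record fields and drug entries are hypothetical, and it is not the initiative's spreadsheet tool.

```python
# Hedged sketch of admission medication reconciliation: flag omissions and
# dose/frequency mismatches between two medication lists. Data is invented.

def reconcile(prior_list, admission_list):
    """Return (drug, discrepancy_type) pairs found on admission."""
    admission = {m["name"]: m for m in admission_list}
    findings = []
    for med in prior_list:
        order = admission.get(med["name"])
        if order is None:
            findings.append((med["name"], "medication name omitted"))
            continue
        if order["dose"] != med["dose"]:
            findings.append((med["name"], "incorrect dose"))
        if order["freq"] != med["freq"]:
            findings.append((med["name"], "incorrect frequency"))
    return findings

prior = [{"name": "metformin", "dose": "500 mg", "freq": "BID"},
         {"name": "warfarin", "dose": "5 mg", "freq": "daily"}]
admission_orders = [{"name": "metformin", "dose": "1000 mg", "freq": "BID"}]
issues = reconcile(prior, admission_orders)
```

    Categories such as "incorrect indication" or "no monitoring parameters" require clinical judgment and would not fall out of a list comparison like this.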

  12. The Value of Experiential Learning in Long-Term Care Education

    ERIC Educational Resources Information Center

    Wasmuth, Norma

    1975-01-01

    Experiential learning has proved a useful tool in adding meaning to an undergraduate course in the problems of aging and delivery of long-term care. Sensory deprivation and institutionalization commonly experienced by the elderly can be simulated. The response to this educational process increased the students' understanding of sensory…

  13. Organic rice disease management using genetic resistance, cover crop and organic fertilizer

    USDA-ARS?s Scientific Manuscript database

    The strong market demand for organic rice has driven the continued increase of organic rice production in the US. However, growers still lack effective tools to manage narrow brown leaf spot (NBLS) caused by Cercospora janseana and brown spot caused by Cochliobolus miyabeanus, two common diseases af...

  14. L2 Identity, Discourse, and Social Networking in Russian

    ERIC Educational Resources Information Center

    Klimanova, Liudmila; Dembovskaya, Svetlana

    2013-01-01

    As the integration of Internet-based social networking tools becomes increasingly popular in foreign language classrooms, the use of modern communication technologies is particularly critical in the context of less commonly taught languages (LCTLs), where student exposure to the target language and its speakers is usually minimal. This paper…

  15. Standardized reporting of functioning information on ICF-based common metrics.

    PubMed

    Prodinger, Birgit; Tennant, Alan; Stucki, Gerold

    2018-02-01

    In clinical practice, research, and national health information systems, a variety of clinical data collection tools are used to collect information on people's functioning. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Second, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are: 1) a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores from two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected with the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of standardized reporting based on common metrics is demonstrated. A subset of items from the three tools, linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life), were entered as "super items" into the Rasch model. Good fit was achieved, with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric.
    Reporting functioning information collected with commonly used clinical data collection tools on ICF-based common metrics enables clinicians and researchers to continue using their tools while still comparing and aggregating the information within and across tools.
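    The measurement framework underlying such common metrics, the dichotomous Rasch model, can be illustrated directly: the probability of endorsing an item is exp(θ − b)/(1 + exp(θ − b)), where θ is the person measure and b the item difficulty, both in logits. Given calibrated difficulties, a raw score maps to a θ on the shared metric by solving for the θ whose expected score equals the observed score. The difficulties below are invented for illustration; the paper's actual calibration uses polytomous "super items".

```python
import math

# Dichotomous Rasch model sketch: map a raw score to a person measure
# (logit) given item difficulties, via Newton's method. Values invented.

def expected_score(theta, difficulties):
    return sum(1 / (1 + math.exp(-(theta - b))) for b in difficulties)

def person_measure(raw_score, difficulties, iters=50):
    """Solve expected_score(theta) = raw_score (0 < raw_score < n items)."""
    theta = 0.0
    for _ in range(iters):
        p = [1 / (1 + math.exp(-(theta - b))) for b in difficulties]
        f = sum(p) - raw_score               # residual
        fprime = sum(q * (1 - q) for q in p)  # test information
        theta -= f / fprime
    return theta

# A hypothetical three-item tool calibrated on the common metric:
tool_a = [-1.0, 0.0, 1.0]          # item difficulties, logits
theta = person_measure(2, tool_a)  # person with raw score 2 of 3
```

    Because θ lives on the common metric rather than on any one tool's raw-score scale, measures from different tools calibrated to the same metric become directly comparable, which is exactly what a transformation table summarizes.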

  16. A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient

    PubMed Central

    DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.

    2016-01-01

    Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653

  17. Malnutrition in Hospitalized Pediatric Patients: Assessment, Prevalence, and Association to Adverse Outcomes.

    PubMed

    Daskalou, Efstratia; Galli-Tsinopoulou, Assimina; Karagiozoglou-Lampoudi, Thomais; Augoustides-Savvopoulou, Persefone

    2016-01-01

    Malnutrition is a frequent finding in pediatric health care settings, in the form of undernutrition or excess body weight. Its increasing prevalence, and its impact on overall health status as reflected in adverse outcomes, render imperative the application of commonly accepted, evidence-based practices and tools by health care providers. Nutrition risk screening on admission and nutrition status evaluation are key points during clinical management of hospitalized pediatric patients, in order to prevent health deterioration that can lead to serious complications and growth consequences. In addition, anthropometric data based on commonly accepted universal growth standards can give accurate results for nutrition status. Both nutrition risk screening and nutrition status assessment are techniques that should be routinely implemented, based on commonly accepted growth standards and methodology, and linked to clinical outcomes. The aim of the present review was to address the issue of hospital malnutrition in pediatric settings in terms of prevalence, outline the nutrition status evaluation and nutrition screening process using different criteria and available tools, and present its relationship with outcome measures. Key teaching points: • Malnutrition (underweight or excess body weight) is a frequent imbalance in pediatric settings that affects physical growth and results in undesirable clinical outcomes. • Anthropometry interpretation through growth charts and nutrition screening are cornerstones for the assessment of malnutrition. To date, no commonly accepted anthropometric criteria or nutrition screening tools are used in hospitalized pediatric patients. • Commonly accepted nutrition status and screening processes based on the World Health Organization's growth standards can contribute to the overall hospital nutrition care of pediatric patients.
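    Interpreting anthropometry against growth standards reduces, at its simplest, to a z-score: how far a child's measurement sits from the reference median for age and sex, in reference standard deviations. The sketch below uses invented reference values and a plain (x − median)/SD form; the actual WHO standards use age- and sex-specific LMS tables, so this is illustrative only.

```python
# Toy z-score sketch for anthropometry interpretation. Reference median and
# SD are invented placeholders, not WHO values.

def z_score(measurement, ref_median, ref_sd):
    return (measurement - ref_median) / ref_sd

def classify_weight_for_age(z):
    """Common cutoff convention: beyond +/-2 z flags an imbalance."""
    if z < -2:
        return "underweight"
    if z > 2:
        return "excess body weight"
    return "typical range"

# A hypothetical child weighing 10.5 kg against a 14.0 kg reference median:
z = z_score(measurement=10.5, ref_median=14.0, ref_sd=1.5)
status = classify_weight_for_age(z)
```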

  18. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.

    2014-12-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta-information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used to index the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, helps users, developers, and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore gain automatically in transparency and reproducibility.
    Furthermore, when the configuration of a newly started evaluation tool matches an earlier run, the system suggests reusing the results already produced by other users, saving CPU time, I/O, and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: guest password: miklip
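    The result-reuse idea, suggesting an existing result when an identical tool configuration is resubmitted, comes down to keying finished runs by a canonical hash of their configuration. The sketch below is a hedged illustration of that mechanism, not the MiKlip code; the in-memory `History` class stands in for the MySQL-backed history sub-system, and the tool name and paths are invented.

```python
import hashlib
import json

# Hedged sketch (not MiKlip's implementation) of configuration-keyed
# result reuse: hash a canonical dump of the tool configuration, record
# finished runs under that key, and look the key up on resubmission.

def config_key(tool, config):
    """Stable key: tool name plus a canonical (sorted-keys) JSON dump."""
    blob = json.dumps({"tool": tool, "config": config}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

class History:
    """In-memory stand-in for the MySQL history sub-system."""
    def __init__(self):
        self._runs = {}

    def record(self, tool, config, result_path):
        self._runs[config_key(tool, config)] = result_path

    def existing_result(self, tool, config):
        """Return a previously produced result path, or None."""
        return self._runs.get(config_key(tool, config))

history = History()
history.record("anomaly-maps", {"model": "MPI-ESM", "decade": "1961-1970"},
               "/work/results/run-0001")
# A second user submits the identical configuration (key order differs):
hit = history.existing_result("anomaly-maps",
                              {"decade": "1961-1970", "model": "MPI-ESM"})
```

    Sorting the keys before hashing is what makes two logically identical configurations collide on the same key regardless of how they were written down.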

  19. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta-information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used to index the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, helps users, developers, and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore gain automatically in transparency and reproducibility.
Furthermore, when configurations match at the start of an evaluation tool run, the system suggests using results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest").
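    The result-reuse mechanism described above (matching a new run's configuration against the stored analysis history) can be sketched in Python. The class name, tool name, and result paths below are illustrative assumptions, not the MiKlip system's actual schema.

```python
import hashlib
import json

class AnalysisHistory:
    """Stores every analysis run keyed by a canonical hash of its
    configuration, so identical runs can reuse earlier results."""

    def __init__(self):
        self._results = {}  # config hash -> stored result reference

    @staticmethod
    def _config_key(tool, config):
        # Canonicalize the configuration so key order does not matter.
        payload = json.dumps({"tool": tool, "config": config}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def lookup(self, tool, config):
        """Return a previously stored result for this exact configuration,
        or None if the analysis still has to be run."""
        return self._results.get(self._config_key(tool, config))

    def store(self, tool, config, result):
        self._results[self._config_key(tool, config)] = result

history = AnalysisHistory()
history.store("murcss", {"variable": "tas", "period": [1981, 2010]}, "/results/run-001")
# A second user starting the same tool with an identical configuration
# (even with keys in a different order) is offered the existing result.
print(history.lookup("murcss", {"period": [1981, 2010], "variable": "tas"}))
```

    Hashing a canonical serialization, rather than comparing raw configuration dictionaries, is what makes the lookup cheap enough to run on every tool start.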

  20. The COMPASS Project

    NASA Astrophysics Data System (ADS)

    Duley, A. R.; Sullivan, D.; Fladeland, M. M.; Myers, J.; Craig, M.; Enomoto, F.; Van Gilst, D. P.; Johan, S.

    2011-12-01

    The Common Operations and Management Portal for Airborne Science Systems (COMPASS) project is a multi-center collaborative effort to advance and extend the research capabilities of the National Aeronautics and Space Administration's (NASA) Airborne Science Program (ASP). At its most basic, COMPASS provides tools for visualizing the position of aircraft and instrument observations during the course of a mission, and facilitates dissemination, discussion, and analysis of multiple disparate data sources in order to more efficiently plan and execute airborne science missions. COMPASS targets a number of key objectives. First, deliver a common operating picture for improved shared situational awareness to all participants in NASA's Airborne Science missions, including scientists, engineers, managers, and the general public. Second, encourage more responsive and collaborative measurements between instruments on multiple aircraft, satellites, and on the surface in order to increase the scientific value of these measurements. Third, provide flexible entry points for data providers to supply model and advanced analysis products to mission team members. Fourth, provide data consumers with a mechanism to ingest, search and display data products. Finally, embrace an open and transparent platform where common data products, services, and end-user components can be shared with the broader scientific community. In pursuit of these objectives, and in concert with requirements solicited by the airborne science research community, the COMPASS project team has delivered a suite of core tools intended to represent the next-generation toolset for airborne research.
This toolset includes a collection of loosely coupled RESTful web services; a system to curate, register, and search commonly used data sources; end-user tools that leverage WebSocket and other next-generation HTML5 technologies to aid real-time aircraft position and data visualization; and an extensible framework to rapidly accommodate mission-specific requirements and mission tools.
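    The "common operating picture" idea can be pictured, under assumed names and fields, as a shared board holding each aircraft's latest reported position, which a map client polls on each refresh. Nothing here reflects COMPASS's actual interfaces.

```python
import time

class PositionBoard:
    """Minimal shared situational-awareness store: the latest known
    position per aircraft, visible to every mission participant."""

    def __init__(self):
        self._latest = {}

    def update(self, aircraft_id, lat, lon, alt_m, timestamp=None):
        # Each report overwrites the previous one for that aircraft.
        self._latest[aircraft_id] = {
            "lat": lat, "lon": lon, "alt_m": alt_m,
            "t": timestamp if timestamp is not None else time.time(),
        }

    def snapshot(self):
        """What a map client would render on each refresh."""
        return dict(self._latest)

board = PositionBoard()
board.update("ER-2", 36.33, -119.95, 19800, timestamp=0)
board.update("Global Hawk", 34.9, -117.9, 16700, timestamp=1)
print(sorted(board.snapshot()))  # aircraft currently on the board
```

    In a real deployment the `update` calls would arrive over the RESTful services and the snapshots would be pushed to browsers over WebSockets rather than polled.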

  1. Improving Freezing Tolerance of Cultivated Potatoes: Moving Frost Hardy Genes From Wild Potatoes and Making Real Progress Using Precise Screening Tools

    USDA-ARS?s Scientific Manuscript database

    The common cultivated species Solanum tuberosum is frost sensitive and is killed at temperatures below -2.5°C. It has been estimated that by increasing frost hardiness by 1–2°C one can expect an increase in potato yield of 26 to 40% in the Altiplano (Peru and Bolivia), covering 63,000 ha of potatoes....

  2. Biologically inspired robotic inspectors: the engineering reality and future outlook (Keynote address)

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph

    2005-04-01

    Human errors have long been recognized as a major factor in the reliability of nondestructive evaluation results. To minimize such errors, there is an increasing reliance on automatic inspection tools that allow faster and more consistent tests. Crawlers and various manipulation devices are commonly used to perform a variety of inspection procedures, including C-scans with contour-following capability to rapidly inspect complex structures. The emergence of robots has been the result of the need to deal with parts that are too complex to handle with a simple automatic system. Economic factors continue to hamper the wide use of robotics for inspection applications; however, technology advances are increasingly changing this paradigm. Autonomous robots, which may look like humans, can potentially address the need to inspect structures whose configurations are not predetermined. The operation of such robots that mimic biology may take place in harsh or hazardous environments that are too dangerous for human presence. Biomimetic technologies such as artificial intelligence, artificial muscles, artificial vision and numerous others are increasingly becoming common engineering tools. Inspired by science fiction, making biomimetic robots is increasingly becoming an engineering reality; in this paper the state of the art is reviewed and the outlook for the future is discussed.

  3. Analysis of Facial Injuries Caused by Power Tools.

    PubMed

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of power tool use, its economic impact, and the characteristics of the hand injuries caused by power saws have been described previously. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data on facial injuries caused by power saws gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of these injuries. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  4. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

    One of the identified goals of EM3 is to implement virtual manufacturing by the end of the year 2000. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the infrastructure must be completed. This will consist of containing the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, virtual manufacturing database, virtual manufacturing paradigm, implementation/integration procedure, and testable verification models must be constructed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and a materials database. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include material thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, a virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into the position of becoming a clearinghouse for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.
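    The machine-tool and tooling records described above can be sketched as a simple data schema; the field names and sample values are assumptions for illustration, not NASA-JSC's actual database design.

```python
from dataclasses import dataclass, field

@dataclass
class MachineTool:
    """One record in the machine tools database."""
    name: str
    envelope_mm: tuple                       # (x, y, z) work envelope
    attachments: list = field(default_factory=list)
    tooling_capacity: int = 0                # number of tool positions
    location: str = ""                       # NASA-JSC site or contractor
    available: bool = True                   # availability/scheduling flag

@dataclass
class Tooling:
    """One record in the tooling database."""
    name: str
    standard: bool                           # standard vs. custom in-house
    properties: dict = field(default_factory=dict)
    available: bool = True

bridge_mill = MachineTool("bridge mill", (3000, 1500, 900),
                          attachments=["right-angle head"],
                          tooling_capacity=40, location="NASA-JSC")
print(bridge_mill.available, len(bridge_mill.attachments))
```

    A "virtual" counterpart of each record would carry the same keys, which is what lets the virtual manufacturing databases stay directly related to the common and machinability databases.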

  5. Comparing MCDA Aggregation Methods in Constructing Composite Indicators Using the Shannon-Spearman Measure

    ERIC Educational Resources Information Center

    Zhou, P.; Ang, B. W.

    2009-01-01

    Composite indicators have been increasingly recognized as a useful tool for performance monitoring, benchmarking comparisons and public communication in a wide range of fields. The usefulness of a composite indicator depends heavily on the underlying data aggregation scheme where multiple criteria decision analysis (MCDA) is commonly used. A…

  6. What We Can Teach When We Teach (About) Religion

    ERIC Educational Resources Information Center

    Hickman, Larry A.

    2016-01-01

    Given the increasing diversity of religious beliefs and outlooks in the United States, John Dewey's proposals regarding "a common faith" can help educators provide the tools for their students to think critically about these and other issues related to the changing religious landscape. Particular attention is given to three groups of…

  7. Real-Time Geospatial Data Viewer (RETIGO): Web-Based Tool for Researchers and Citizen Scientists to Explore their Air Measurements

    EPA Science Inventory

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data re...

  8. Mechanical site preparation for forest restoration

    Treesearch

    Magnus Lof; Daniel C. Dey; Rafael M. Navarro; Douglass F. Jacobs

    2012-01-01

    Forest restoration projects have become increasingly common around the world and planting trees is almost always a key component. Low seedling survival and growth may result in restoration failures and various mechanical site preparation techniques for treatment of soils and vegetation are important tools used to help counteract this. In this article, we synthesize the...

  9. Using Wireless Response Systems to Replicate Behavioral Research Findings in the Classroom

    ERIC Educational Resources Information Center

    Cleary, Anne M.

    2008-01-01

    College instructors are increasingly relying on wireless clicker systems as instructional tools in the classroom. Instructors commonly use clicker systems for such classroom activities as taking attendance, giving quizzes, and taking opinion polls. However, these systems are uniquely well suited for the teaching of psychology and other courses…

  10. Enhancing Secondary Science Content Accessibility with Video Games

    ERIC Educational Resources Information Center

    Marino, Matthew T.; Becht, Kathleen M.; Vasquez, Eleazar, III; Gallup, Jennifer L.; Basham, James D.; Gallegos, Benjamin

    2014-01-01

    Mobile devices, including iPads, tablets, and so on, are common in high schools across the country. Unfortunately, many secondary teachers see these devices as distractions rather than tools for scaffolding instruction. This article highlights current research related to the use of video games as a means to increase the cognitive and social…

  11. Indigenous Knowledge as a Tool for Self-Determination and Liberation.

    ERIC Educational Resources Information Center

    Hill, Dawn Martin

    This paper explores aspects of Indigenous knowledge on several levels and examines the role of Indigenous knowledge in Indigenous empowerment as the number and influence of Native people in academia increases. Indigenous peoples worldwide have a common set of assumptions that forms a context or paradigm--a collective core of interrelated…

  12. The aquamet Package for R: A Tool for Use with the National Rivers and Streams Assessment

    EPA Science Inventory

    The use of R software in environmental data analysis has become increasingly common because it is very powerful, versatile and available free of charge, with hundreds of contributed add-on packages available that perform almost every conceivable type of analysis or task. The Envi...

  13. Perceptions of Health Impact Assessments in Influencing, Policy Decisions through Health Communications

    ERIC Educational Resources Information Center

    Ford, Cheryl

    2016-01-01

    Over the past 30 years, public health Practitioners worldwide have increasingly relied on Health Impact Assessments (HIAs) as a tool for informing decision makers of the potential health impacts of proposed policies, programs, and planning decisions. Adoption of the HIA is significantly less common in the United States than in international…

  14. Process Evaluation of the Instant Word Notebook

    ERIC Educational Resources Information Center

    Roberts, Jeannie Ellen

    2010-01-01

    This program evaluation of The Instant Word Notebook was conducted by two educators who created an instructional tool to teach and assess the most frequently occurring words in written text, commonly known as Instant Words. In an effort to increase the reading scores of first and second grade students, teachers were instructed to teach Instant…

  15. Development and formative evaluation of a visual e-tool to help decision makers navigate the evidence around health financing.

    PubMed

    Skordis-Worrall, Jolene; Pulkki-Brännström, Anni-Maria; Utley, Martin; Kembhavi, Gayatri; Bricki, Nouria; Dutoit, Xavier; Rosato, Mikey; Pagel, Christina

    2012-12-21

    There are calls for low and middle income countries to develop robust health financing policies to increase service coverage. However, existing evidence around financing options is complex and often difficult for policy makers to access. Our aim was to summarize the evidence on the impact of financing health systems and to develop an e-tool to help decision makers navigate the findings. After reviewing the literature, we used thematic analysis to summarize the impact of 7 common health financing mechanisms on 5 common health system goals. Information on the relevance of each study to a user's context was provided by 11 country indicators. A Web-based e-tool was then developed to assist users in navigating the literature review. This tool was evaluated using feedback from early users, collected using an online survey and in-depth interviews with key informants. The e-tool provides graphical summaries that allow a user to assess the following parameters with a single snapshot: the number of relevant studies available in the literature, the heterogeneity of evidence, where key evidence is lacking, and how closely the evidence matches their own context. Users particularly liked the visual display and found navigating the tool intuitive. However, there was concern that a lack of evidence on positive impact might be construed as evidence against a financing option and that the tool might over-simplify the available financing options. Complex evidence can be made more easily accessible and potentially more understandable using basic Web-based technology and innovative graphical representations that match findings to the users' goals and context.
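    The matrix at the core of such a tool (financing mechanisms × health system goals, with context matching via country indicators) can be sketched as follows. The study records, mechanism names, and indicator names are invented for illustration; only the structure follows the abstract.

```python
from collections import defaultdict

# Hypothetical study records: each links a financing mechanism to a
# health system goal and carries country indicators for context matching.
STUDIES = [
    {"mechanism": "user fees", "goal": "service coverage",
     "indicators": {"income": "low", "region": "africa"}},
    {"mechanism": "user fees", "goal": "equity",
     "indicators": {"income": "low", "region": "asia"}},
    {"mechanism": "social insurance", "goal": "service coverage",
     "indicators": {"income": "middle", "region": "asia"}},
]

def evidence_summary(studies, user_context):
    """For each (mechanism, goal) cell, return the number of available
    studies and the best fraction of matching context indicators."""
    cells = defaultdict(list)
    for s in studies:
        match = sum(1 for k, v in user_context.items()
                    if s["indicators"].get(k) == v)
        cells[(s["mechanism"], s["goal"])].append(match / len(user_context))
    return {cell: (len(ms), max(ms)) for cell, ms in cells.items()}

summary = evidence_summary(STUDIES, {"income": "low", "region": "asia"})
print(summary[("user fees", "equity")])  # (1, 1.0): one study, full context match
```

    A graphical front end would render each cell's study count and best context match as the single-snapshot summary the abstract describes.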

  16. [Actinic keratosis: New concept and therapeutic update].

    PubMed

    Carmena-Ramón, Rafael; Mateu-Puchades, Almudena; Santos-Alarcón, Sergio; Lucas-Truyols, Sofía

    2017-10-01

    Actinic keratosis (AK) is a common reason for consultation in both Primary Care and Specialised Care. It is the third or fourth most common reason for consultation in dermatology, accounting for up to 5-6% of patients attended. It has also been observed that its prevalence has increased over the last 10 years compared to other dermatoses, and it is expected to continue to increase due to longer life expectancy and changes in sun exposure habits since the middle of the last century. The aim of this article is to update the concepts of AK and the cancerisation field, and to present the currently available therapeutic tools. Copyright © 2017. Published by Elsevier España, S.L.U.

  17. Training generalized improvisation of tools by preschool children

    PubMed Central

    Parsonson, Barry S.; Baer, Donald M.

    1978-01-01

    The development of new, “creative” behaviors was examined in a problem-solving context. One form of problem solving, improvisation, was defined as finding a substitute to replace the specifically designated, but currently unavailable, tool ordinarily used to solve the problem. The study examined whether preschool children spontaneously displayed generalized improvisation skills, and if not, whether they could be trained to do so within different classes of tools. Generalization across different tool classes was monitored but not specifically trained. Five preschool children participated in individual sessions that first probed their skill at improvising tools, and later trained and probed generalized improvisation in one or more of three tool classes (Hammers, Containers, and Shoelaces), using a multiple-baseline design. All five children were trained with Hammers, two were trained in two classes, and two were trained in all three tool classes. Four of the five children improvised little in Baseline. During Training, all five showed increased generalized improvisation within the trained class, but none across classes. Tools fabricated by item combinations were rare in Baseline, but common in Training. Followup probes showed that the training effects were durable. PMID:16795596

  18. Using competences and competence tools in workforce development.

    PubMed

    Green, Tess; Dickerson, Claire; Blass, Eddie

    The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs.

  19. Distributed Observatory Management

    NASA Astrophysics Data System (ADS)

    Godin, M. A.; Bellingham, J. G.

    2006-12-01

    A collection of tools for collaboratively managing a coastal ocean observatory has been developed and used in a multi-institutional, interdisciplinary field experiment. The Autonomous Ocean Sampling Network program created these tools to support the Adaptive Sampling and Prediction (ASAP) field experiment that occurred in Monterey Bay in the summer of 2006. ASAP involved the day-to-day participation of a large group of researchers located across North America. The goal of these investigators was to adapt an array of observational assets to optimize data collection and analysis. Achieving the goal required continual interaction, but the long duration of the observatory made sustained co-location of researchers difficult. The ASAP team needed a remote collaboration tool, the capability to add non-standard, interdisciplinary data sets to the overall data collection, and the ability to retrieve standardized data sets from the collection. Over the course of several months and "virtual experiments," the Collaborative Ocean Observatory Portal (COOP) collaboration tool was created, along with tools for centralizing, cataloging, and converting data sets into common formats, and tools for generating automated plots of the common-format data. Accumulating the data in a central location and converting the data to common formats allowed any team member to manipulate any data set quickly, without having to rely heavily on the expertise of data generators to read the data. The common data collection allowed for the development of a wide range of comparison plots and allowed team members to quickly assimilate new data sources into derived outputs such as ocean models. In addition to the standardized outputs, team members were able to produce their own specialized products and link to these through the collaborative portal, which made the experimental process more interdisciplinary and interactive. COOP was used to manage the ASAP vehicle program from its start in July 2006.
New summaries were posted to the COOP tool on a daily basis and updated with announcements on the schedule, system status, voting results from the previous day, ocean, atmosphere and hardware conditions, adaptive sampling and coordinated control, and forecasts. The collection of standardized data files was used to generate daily plots of observed and predicted currents, temperature, and salinity. Team members were able to participate from any internet-accessible location using common Internet browsers, and any team member could add to the day's summary, point out trends and discuss observations, and make an adaptation proposal. If a team member submitted a proposal, team-wide discussion and voting followed. All interactions were archived and left publicly accessible so that future experiments could be made more systematic with increased automation. The need for collaboration and data handling tools is important for future ocean observatories, which will require 24-hour-a-day, 7-day-a-week interactions over many years. As demonstrated in the ASAP experiment, the COOP tool and associated data handling tools allowed scientists to coherently and collaboratively manage an ocean observatory without being co-located at the observatory. Lessons learned from operating these collaborative tools during the ASAP experiment provide an important foundation for creating even more capable portals.
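    The data-centralization step described above (converting each team's native records into one common format) can be sketched with a simple converter registry. The source names, record layouts, and units below are assumptions for illustration, not the experiment's actual formats.

```python
# Each instrument team registers a converter from its native record
# format into one shared schema, so any team member can plot any data set.
CONVERTERS = {}

def converter(source):
    """Decorator that registers a converter for a named data source."""
    def register(fn):
        CONVERTERS[source] = fn
        return fn
    return register

@converter("glider")
def from_glider(rec):
    # native format: (epoch seconds, temperature degC, salinity psu)
    t, temp, sal = rec
    return {"time": t, "temp_c": temp, "sal_psu": sal, "source": "glider"}

@converter("mooring")
def from_mooring(rec):
    # native format: dict with different field names, temperature in Kelvin
    return {"time": rec["ts"], "temp_c": rec["T_K"] - 273.15,
            "sal_psu": rec["S"], "source": "mooring"}

def to_common(source, rec):
    """Convert one native record into the shared schema."""
    return CONVERTERS[source](rec)

print(round(to_common("mooring", {"ts": 100, "T_K": 288.15, "S": 33.5})["temp_c"], 2))
```

    Once every source emits the same keys and units, the automated daily plots can treat glider, mooring, and model records interchangeably.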

  20. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist that are capable of visualizing a variety of programming languages and software constructs, and the list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem: the ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large-system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large-system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
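    The directed-graph foundation the dissertation builds on can be illustrated with a small sketch: components as nodes, dependencies as edges, and one symptom of "spaghetti" structure (a dependency cycle) detected by depth-first search. The component names are invented for illustration.

```python
def find_cycle(graph):
    """Depth-first search for a dependency cycle in a directed graph
    given as {node: [nodes it depends on]}; returns the nodes of one
    cycle, or None if the graph is acyclic."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / in progress / done
    color = {n: WHITE for n in graph}
    stack = []                            # current DFS path

    def visit(n):
        color[n] = GRAY
        stack.append(n)
        for m in graph.get(n, ()):
            if color.get(m, WHITE) == GRAY:
                return stack[stack.index(m):]   # back edge: cycle found
            if color.get(m, WHITE) == WHITE and (c := visit(m)):
                return c
        stack.pop()
        color[n] = BLACK
        return None

    for n in graph:
        if color[n] == WHITE:
            c = visit(n)
            if c:
                return c
    return None

deps = {"ui": ["core"], "core": ["db", "log"], "db": ["log"], "log": ["core"]}
print(find_cycle(deps))  # ['core', 'db', 'log']
```

    A visualization tool would use the same traversal to cluster or collapse such cycles before layout, which is one way to keep large dependency networks readable.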

  1. U-Pb Ages of Lunar Apatites

    NASA Technical Reports Server (NTRS)

    Vaughan, J.; Nemchin, A. A.; Pidgeon, R. T.; Meyer, Charles

    2006-01-01

    Apatite is one of the minerals that is rarely utilized in U-Pb geochronology, compared to some other U-rich accessory phases. Relatively low U concentration, a commonly high proportion of common Pb, and the low closure temperature of the U-Pb system in apatite inhibit its application as a geochronological tool when other minerals such as zircon are widely available. However, zircon appears to be restricted to certain types of lunar rocks carrying the so-called KREEP signature, whereas apatite (and whitlockite) is a common accessory mineral in lunar samples. Therefore, utilizing apatite for lunar chronology may increase the pool of rocks available for U-Pb dating. The low stability of the U-Pb systematics of apatite may also result in resetting of the system during meteoritic bombardment, in which case apatite may provide an additional tool for the study of the impact history of the Moon. In order to investigate these possibilities, we have analysed apatites and zircons from two breccia samples collected during the Apollo 14 mission. Both samples were collected within the Fra Mauro formation, which is interpreted as material ejected during the impact that formed the Imbrium Basin.

  2. SYRCLE’s risk of bias tool for animal studies

    PubMed Central

    2014-01-01

    Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCTs) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half of these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063
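    A 10-entry risk-of-bias tool of this kind can be pictured in code as a list of bias-domain/signalling-question pairs plus a per-study judgment for each entry. The entries below only paraphrase the kinds of items the abstract describes and are not a substitute for the published tool.

```python
# Illustrative checklist: (bias domain, signalling question) pairs.
ENTRIES = [
    ("selection bias", "Was the allocation sequence adequately generated?"),
    ("selection bias", "Were the groups similar at baseline?"),
    ("selection bias", "Was allocation adequately concealed?"),
    ("performance bias", "Were the animals randomly housed?"),
    ("performance bias", "Were caregivers blinded to the intervention?"),
    ("detection bias", "Were animals selected at random for outcome assessment?"),
    ("detection bias", "Was the outcome assessor blinded?"),
    ("attrition bias", "Were incomplete outcome data adequately addressed?"),
    ("reporting bias", "Are reports free of selective outcome reporting?"),
    ("other", "Was the study free of other sources of bias?"),
]

def summarize(judgments):
    """Count 'low'/'high'/'unclear' judgments for one study and flag the
    study if any entry is judged high risk."""
    counts = {"low": 0, "high": 0, "unclear": 0}
    for j in judgments:
        counts[j] += 1
    return counts, counts["high"] > 0

# One judgment per checklist entry, in order.
judgments = ["low"] * 8 + ["unclear", "high"]
counts, flagged = summarize(judgments)
print(counts, flagged)
```

    Encoding the signalling questions alongside the domains is what makes such a checklist transparent: each judgment can be traced back to the question that produced it.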

  3. "Headache Tools to Stay in School": Assessment, Development, and Implementation of an Educational Guide for School Nurses

    ERIC Educational Resources Information Center

    Lazdowsky, Lori; Rabner, Jonathan; Caruso, Alessandra; Kaczynski, Karen; Gottlieb, Sarah; Mahoney, Elyse; LeBel, Alyssa

    2016-01-01

    Background: Headache is the most common type of pain reported in the pediatric population, and chronic headache is an increasingly prevalent and debilitating pain condition in children and adolescents. With large numbers of students experiencing acute headaches and more students with chronic headache reentering typical school settings, greater…

  4. The National Map - Geographic Names

    USGS Publications Warehouse

    ,

    2002-01-01

    Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.

  5. The National Map - Orthoimagery

    USGS Publications Warehouse

    ,

    2002-01-01

    Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.

  6. Enhancing Student Learning in Marketing Courses: An Exploration of Fundamental Principles for Website Platforms

    ERIC Educational Resources Information Center

    Hollenbeck, Candice R.; Mason, Charlotte H.; Song, Ji Hee

    2011-01-01

    The design of a course has potential to help marketing students achieve their learning objectives. Marketing courses are increasingly turning to technology to facilitate teaching and learning, and pedagogical tools such as Blackboard, WebCT, and e-Learning Commons are essential to the design of a course. Here, the authors investigate the research…

  7. Immunological Tools: Engaging Students in the Use and Analysis of Flow Cytometry and Enzyme-linked Immunosorbent Assay (ELISA)

    ERIC Educational Resources Information Center

    Ott, Laura E.; Carson, Susan

    2014-01-01

    Flow cytometry and enzyme-linked immunosorbent assay (ELISA) are commonly used techniques associated with clinical and research applications within the immunology and medical fields. The use of these techniques is becoming increasingly valuable in many life science and engineering disciplines as well. Herein, we report the development and…

  8. Wiki Use that Increases Communication and Collaboration Motivation

    ERIC Educational Resources Information Center

    Davidson, Robyn

    2012-01-01

    Communication and collaboration can be readily enabled by the use of many ICT tools. A wiki, which is an easily accessible and editable website, is one such platform that provides the opportunity for students to work on group projects without the barriers that arise from traditional group work. Whilst wiki use is becoming more common, its use in…

  9. The National Map - Elevation

    USGS Publications Warehouse


    2002-01-01

    Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.

  10. The National Map - Hydrography

    USGS Publications Warehouse


    2002-01-01

    Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.

  11. Oak regeneration potential increased by shelterwood treatments

    Treesearch

    Richard C. Schlesinger; Ivan L. Sander; Kenneth R. Davidson

    1993-01-01

    In much of the Central Hardwood Forest Region, oak species are not regenerating well, even though large oak trees are common within the existing forests. The shelterwood method has been suggested as a potential tool for establishing and developing advanced regeneration where it is lacking. The 10-yr results from a study of several variants of the shelterwood method...

  12. Lowering Blood Alcohol Content Levels to Save Lives: The European Experience

    ERIC Educational Resources Information Center

    Albalate, Daniel

    2008-01-01

    Road safety is of increasing concern in developed countries because of the significant number of deaths and large economic losses. One tool commonly used by governments to deal with road accidents is the enactment of stricter policies and regulations. Drunk driving is one of the leading concerns in this field and several European countries have…

  13. Assessing and Managing Quality of Information Assurance

    DTIC Science & Technology

    2010-11-01

    such as firewalls, antivirus scanning tools and mechanisms for user authentication and authorization. Advanced mission-critical systems often...imply increased risk to DoD information systems. The Process and Organizational Maturity (POM) class focuses on the maturity of the software and...include architectural quality. Common Weakness Enumeration (CWE) is a recent example that highlights the connection between software quality and

  14. Model-Based Reasoning: Using Visual Tools to Reveal Student Learning

    ERIC Educational Resources Information Center

    Luckie, Douglas; Harrison, Scott H.; Ebert-May, Diane

    2011-01-01

    Using visual models is common in science and should become more common in classrooms. Our research group has developed and completed studies on the use of a visual modeling tool, the Concept Connector. This modeling tool consists of an online concept mapping Java applet that has automatic scoring functions we refer to as Robograder. The Concept…

  15. Intra-grain Common Pb Correction and Detrital Apatite U-Pb Dating via LA-ICPMS Depth Profiling

    NASA Astrophysics Data System (ADS)

    Boyd, P. D.; Galster, F.; Stockli, D. F.

    2017-12-01

    Apatite is a common accessory phase in igneous and sedimentary rocks. While apatite is widely employed as a low-temperature thermochronometric tool, it has been increasingly utilized to constrain moderate-temperature cooling histories by U-Pb dating. Apatite U-Pb is characterized by a thermal sensitivity window of 375-550°C. This unique temperature window, together with the near-ubiquitous presence of apatite in igneous and clastic sedimentary rocks, makes the apatite U-Pb system a powerful tool for illuminating mid-crustal tectono-thermal processes. However, because apatite incorporates only modest amounts of U and Th (1-10s of ppm), the significant amount of non-radiogenic "common" Pb incorporated during its formation presents a major hurdle for apatite U-Pb dating. In bedrock samples, common Pb in apatite can be corrected for by measuring Pb in a cogenetic mineral phase that does not incorporate U, such as feldspar, or by determining a common Pb composition from multiple analyses in Tera-Wasserburg space. While these methods for common Pb correction can work for igneous samples, they cannot be applied to detrital apatite in sedimentary rocks with variable common Pb compositions. The obstacle of common Pb has hindered the application of detrital apatite U-Pb dating in provenance studies, despite its potential as a powerful tool. This study presents a new method for the in situ correction of common Pb in apatite through novel LA-ICP-MS depth profiling, which can recover U-Pb ratios at micron-scale spatial resolution during ablation of a grain. Because of the intra-grain U variability in apatite, a mixing line for a single grain can be generated in Tera-Wasserburg concordia space. As a first case study, apatite grains from a Variscan Alpine granite were analyzed using both the single-grain and multi-grain methods, which gave identical results. As a second case study, the intra-grain method was applied to detrital apatite from the Swiss Northern Alpine Foreland Basin, where the common Pb compositions and age spectra of detrital apatite grains were elucidated. The novel intra-grain method enables correction for common Pb in detrital apatite, making it feasible to incorporate detrital apatite U-Pb dating in provenance and source-to-sink studies.
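The intra-grain correction described in this record hinges on fitting a mixing line in Tera-Wasserburg space, where the y-intercept (at 238U/206Pb = 0) estimates the grain's common-Pb 207Pb/206Pb composition. A minimal sketch of that fit, using invented synthetic ratios (the slope, intercept, and noise level are all assumptions, not real analyses):

```python
import numpy as np

# Hypothetical measurements along a single-grain mixing line in
# Tera-Wasserburg space: x = 238U/206Pb, y = 207Pb/206Pb.
true_intercept = 0.85   # assumed common-Pb 207Pb/206Pb composition
true_slope = -0.04      # assumed slope of the mixing line

rng = np.random.default_rng(0)
x = np.linspace(2.0, 18.0, 25)                      # intra-grain U/Pb spread
y = true_intercept + true_slope * x + rng.normal(0, 1e-3, x.size)

# Least-squares fit of the mixing line; the y-intercept estimates the
# grain's common-Pb 207Pb/206Pb, which is what the correction needs.
slope, intercept = np.polyfit(x, y, 1)
print(round(intercept, 3))
```

In practice a proper treatment would propagate analytical uncertainties and intersect the line with the concordia curve, but the linear fit is the core of the intra-grain approach as summarized above.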

  16. Multi Modal Anticipation in Fuzzy Space

    NASA Astrophysics Data System (ADS)

    Asproth, Viveca; Holmberg, Stig C.; Håkansson, Anita

    2006-06-01

    We are all stakeholders in the geographical space, which makes up our common living and activity space. This means that careful, creative, and anticipatory planning, design, and management of that space will be of paramount importance for our sustained life on earth. Here it is shown that the quality of such planning could be significantly increased with the help of a computer-based modelling and simulation tool. Further, the design and implementation of such a tool ought to be guided by the conceptual integration of some core concepts like anticipation and retardation, multi modal system modelling, fuzzy space modelling, and multi actor interaction.

  17. Travel During Pregnancy: Considerations for the Obstetric Provider.

    PubMed

    Antony, Kathleen M; Ehrenthal, Deborah; Evensen, Ann; Iruretagoyena, J Igor

    2017-02-01

    Travel among US citizens is becoming increasingly common, and travel during pregnancy is also speculated to be increasingly common. During pregnancy, the obstetric provider may be the first or only clinician approached with questions regarding travel. In this review, we discuss the reasons women travel during pregnancy, medical considerations for long-haul air travel, destination-specific medical complications, and precautions for pregnant women to take both before travel and while abroad. To improve the quality of pretravel counseling for patients before or during pregnancy, we have created 2 tools: a guide for assessing the pregnant patient's risk during travel and a pretravel checklist for the obstetric provider. A PubMed search for English-language publications about travel during pregnancy was performed using the search terms "travel" and "pregnancy" and was limited to those published since the year 2000. Studies on subtopics were not limited by year of publication. Eight review articles were identified. Three additional studies that analyzed data from travel clinics were found, and 2 studies reported on the frequency of international travel during pregnancy. Additional publications addressed air travel during pregnancy (10 reviews, 16 studies), high-altitude travel during pregnancy (5 reviews, 5 studies), and destination-specific illnesses in pregnant travelers. Travel during pregnancy including international travel is common. Pregnant travelers have unique travel-related and destination-specific risks. We review those risks and provide tools for obstetric providers to use in counseling pregnant travelers.

  18. Comprehensive Shoulder US Examination: A Standardized Approach with Multimodality Correlation for Common Shoulder Disease

    PubMed Central

    Sheehan, Scott E.; Orwin, John F.; Lee, Kenneth S.

    2016-01-01

    Shoulder pain is one of the most common musculoskeletal conditions encountered in primary care and specialty orthopedic clinic settings. Although magnetic resonance (MR) imaging is typically the modality of choice for evaluating the soft-tissue structures of the shoulder, ultrasonography (US) is becoming an important complementary imaging tool in the evaluation of superficial soft-tissue structures such as the rotator cuff, subacromial-subdeltoid bursa, and biceps tendon. The advantages of US driving its recent increased use include low cost, accessibility, and capability for real-time high-resolution imaging that enables dynamic assessment and needle guidance. As more radiologists are considering incorporating shoulder US into their practices, the development of a standardized approach to performing shoulder US should be a priority to facilitate the delivery of high-quality patient care. Familiarity with and comfort in performing a standardized shoulder US examination, as well as knowledge of the types of anomalies that can be evaluated well with US, will enhance the expertise of those working in musculoskeletal radiology practices and add value in the form of increased patient and health care provider satisfaction. This review describes the utility and benefits of shoulder US as a tool that complements MR imaging in the assessment of shoulder pain. A standardized approach to the shoulder US examination is also described, with a review of the basic technique of this examination, normal anatomy of the shoulder, common indications for shoulder US, and characteristic US findings of common shoulder diseases—with select MR imaging and arthroscopic correlation. Online supplemental material is available for this article. ©RSNA, 2016 PMID:27726738

  19. The development of a digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey Lindsay

    Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important, because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. 
We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include on the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical testing theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential for providing a meaningful tool for comparing student learning at different institutions.
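The classical-testing-theory reliability checks mentioned in this record typically include an internal-consistency statistic such as Cronbach's alpha (equivalent to KR-20 for dichotomous items). A hedged sketch of that computation on an invented examinees-by-items response matrix (not DLCI data):

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Internal-consistency reliability for an (examinees x items) 0/1 matrix.

    Standard classical-test-theory formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
    """
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)       # per-item sample variance
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical response matrix: 6 examinees x 4 items, 1 = correct.
data = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])
alpha = cronbach_alpha(data)
```

Values near 0.8 or above are conventionally read as acceptable reliability for a concept inventory, though the thresholds are judgment calls.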

  20. Development of an Ergonomics Checklist for Investigation of Work-Related Whole-Body Disorders in Farming - AWBA: Agricultural Whole-Body Assessment.

    PubMed

    Kong, Y K; Lee, S J; Lee, K S; Kim, G R; Kim, D M

    2015-10-01

    Researchers have been using various ergonomic tools to study occupational musculoskeletal diseases in industrial contexts. However, in agricultural work, where the work environment is poorer and the socio-psychological stress is high due to the high labor intensities of the industry, current research efforts have been scarce, and the number of available tools is small. In our preliminary studies, which focused on a limited number of body parts and other working elements, we developed separate evaluation tools for the upper and lower extremities. The current study was conducted to develop a whole-body ergonomic assessment tool for agricultural work that integrates the existing assessment tools for lower and upper extremities developed in the preliminary studies and to verify the relevance of the integrated assessment tool. To verify the relevance of the Agricultural Whole-Body Assessment (AWBA) tool, we selected 50 different postures that occur frequently in agricultural work. Our results showed that the AWBA-determined risk levels were similar to the subjective risk levels determined by experts. In addition, as the risk level increased, the average risk levels from the AWBA and expert assessments increased to a similar extent. Moreover, the differences in risk levels between the AWBA and expert assessments were mostly smaller than the differences in risk levels between other assessment tools and the expert assessments in this study. In conclusion, the AWBA tool developed in this study was demonstrated to be appropriate for use as a tool for assessing various postures commonly assumed in agricultural work. Moreover, we believe that our verification of the assessment tools will contribute to the enhancement of the quality of activities designed to prevent and control work-related musculoskeletal diseases in other industries.

  1. Beta thalassemia in 31,734 cases with HBB gene mutations: Pathogenic and structural analysis of the common mutations; Iran as the crossroads of the Middle East.

    PubMed

    Mahdieh, Nejat; Rabbani, Bahareh

    2016-11-01

    Thalassemia is one of the most common single gene disorders worldwide. Nearly 80 to 90 million people worldwide carry minor beta thalassemia, and 60-70 thousand affected infants are born annually. A comprehensive search on several databases including PubMed, InterScience, British Library Direct, and Science Direct was performed extracting papers about mutation detection and frequency of beta thalassemia. All papers reporting on the mutation frequency of beta thalassemia patients were selected to analyze the frequency of mutations in different regions and various ethnicities. Mutations of 31,734 individuals were identified. Twenty common mutations were selected for further analysis. Genotype-phenotype correlation, interactome, and in silico analyses of the mutations were performed using available bioinformatics tools. Secondary structure prediction was achieved for two common mutations with online tools. These mutations were also common in the countries neighboring Iran, where they account for 71% to 98% of mutations. Computational analyses could be used in addition to segregation and expression analysis to assess the extent of pathogenicity of the variant. The genetics of beta thalassemia in Iran is more extensively heterogeneous than in neighboring countries. Some common mutations have arisen historically from Iran and moved to other populations due to population migrations. Also, due to genetic drift, the frequencies of some mutations have increased in small populations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Revisiting "The Master's Tools": Challenging Common Sense in Cross-Cultural Teacher Education

    ERIC Educational Resources Information Center

    Chinnery, Ann

    2008-01-01

    According to Kevin Kumashiro (2004), education toward a socially just society requires a commitment to challenge common sense notions or assumptions about the world and about teaching and learning. Recalling Audre Lorde's (1984) classic essay, "The Master's Tools Will Never Dismantle the Master's House," I focus on three common sense notions and…

  3. Drivers for animal welfare policies in Europe.

    PubMed

    Dalla Villa, P; Matthews, L R; Alessandrini, B; Messori, S; Migliorati, G

    2014-04-01

    The European region has been, and remains, a global leader in the development of animal welfare policies. The region has a great diversity of cultures and religions, different levels of socio-economic development, and varied legislation, policies and practices. Nevertheless, there are common drivers for animal welfare policy based on a history of animal welfare ethics and obligations to animal users and society in general. A unifying goal of countries in the region is to achieve sustainable compliance with the World Organisation for Animal Health (OIE) standards on animal health and welfare. Ethics is the overarching driver, supported by the actions of governmental, inter-governmental and non-governmental activities, markets and trade, science and knowledge. Historically, organisations involved in promoting animal welfare have tended to act in isolation. For example, non-governmental organisations (NGOs) have run campaigns to influence retailers and the welfare policies of their farmer suppliers. Increasingly, different organisations with common or complementary goals are working together. For example, competent authorities, inter-governmental bodies and NGOs have combined their efforts to address dog population control across several countries in the region. Also, animal welfare is becoming integrated into the corporate social responsibility targets of private companies. Science and knowledge, as drivers and tools, are assisting with the harmonisation of welfare standards, e.g. by providing a common basis for measuring welfare impacts through animal-based measures and widespread sharing of this information. Current trends suggest that there will be greater collaboration among the organisations driving change, and increasing convergence of animal welfare strategies and welfare assessment tools. The result will be increased harmonisation of animal welfare standards throughout the region.

  4. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  5. Establishment of the Ivermectin Research for Malaria Elimination Network: updating the research agenda.

    PubMed

    Chaccour, Carlos J; Rabinovich, N Regina; Slater, Hannah; Canavati, Sara E; Bousema, Teun; Lacerda, Marcus; Ter Kuile, Feiko; Drakeley, Chris; Bassat, Quique; Foy, Brian D; Kobylinski, Kevin

    2015-06-11

    The potential use of ivermectin as an additional vector control tool is receiving increased attention from the malaria elimination community, driven by the increased importance of outdoor/residual malaria transmission and the threat of insecticide resistance where vector tools have been scaled-up. This report summarizes the emerging evidence presented at a side meeting on "Ivermectin for malaria elimination: current status and future directions" at the annual meeting of the American Society of Tropical Medicine and Hygiene in New Orleans on November 4, 2014. One outcome was the creation of the "Ivermectin Research for Malaria Elimination Network" whose main goal is to establish a common research agenda to generate the evidence base on whether ivermectin-based strategies should be added to the emerging arsenal to interrupt malaria transmission.

  6. Knowing the operative game plan: a novel tool for the assessment of surgical procedural knowledge.

    PubMed

    Balayla, Jacques; Bergman, Simon; Ghitulescu, Gabriela; Feldman, Liane S; Fraser, Shannon A

    2012-08-01

    What is the source of inadequate performance in the operating room? Is it a lack of technical skills, poor judgment or a lack of procedural knowledge? We created a surgical procedural knowledge (SPK) assessment tool and evaluated its use. We interviewed medical students, residents and training program staff on SPK assessment tools developed for 3 different common general surgery procedures: inguinal hernia repair with mesh in men, laparoscopic cholecystectomy and right hemicolectomy. The tools were developed as a step-wise assessment of specific surgical procedures based on techniques described in a current surgical text. We compared novice (medical student to postgraduate year [PGY]-2) and expert group (PGY-3 to program staff) scores using the Mann-Whitney U test. We calculated the total SPK score and defined a cut-off score using receiver operating characteristic analysis. In all, 5 participants in each of 7 different training groups (n = 35) underwent an interview. Median scores for each procedure and overall SPK scores increased with experience. The median SPK for novices was 54.9 (95% confidence interval [CI] 21.6-58.8) compared with 98.05 (95% CI 94.1-100.0) for experts (p = 0.012). The SPK cut-off score of 93.1 discriminates between novice and expert surgeons. Surgical procedural knowledge can reliably be assessed using our SPK assessment tool. It can discriminate between novice and expert surgeons for common general surgical procedures. Future studies are planned to evaluate its use for more complex procedures.
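A receiver operating characteristic (ROC) cut-off of the kind this record describes is often chosen by maximizing Youden's J (sensitivity + specificity - 1) over candidate thresholds. A hedged sketch with illustrative score lists (invented, loosely echoing the medians quoted above, not the study's data):

```python
# Hypothetical SPK scores: novices as the "negative" group, experts "positive".
novice = [40.0, 52.5, 54.9, 58.8, 61.2, 70.0]
expert = [94.1, 96.0, 98.05, 99.0, 100.0]

def best_cutoff(neg, pos):
    """Return (threshold, Youden's J) maximizing J = sensitivity + specificity - 1."""
    candidates = sorted(set(neg) | set(pos))
    best, best_j = None, -1.0
    for c in candidates:
        sens = sum(p >= c for p in pos) / len(pos)   # experts classed as expert
        spec = sum(n < c for n in neg) / len(neg)    # novices classed as novice
        j = sens + spec - 1
        if j > best_j:
            best, best_j = c, j
    return best, best_j

cutoff, j = best_cutoff(novice, expert)
```

With fully separated groups, J reaches 1.0 and any threshold between the groups discriminates perfectly; with overlapping real data the maximizing threshold trades sensitivity against specificity.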

  7. Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015

    PubMed Central

    Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.

    2016-01-01

    Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429

  8. Continued progress in the prevention of nail gun injuries among apprentice carpenters: what will it take to see wider spread injury reductions?

    PubMed

    Lipscomb, Hester J; Nolan, James; Patterson, Dennis; Dement, John M

    2010-06-01

    Nail guns are a common source of acute, and potentially serious, injury in residential construction. Data on nail gun injuries, hours worked and hours of tool use were collected in 2008 from union apprentice carpenters (n=464) through classroom surveys; this completed four years of serial cross-sectional data collection from apprentices. A predictive model of injury risk was constructed using Poisson regression. Injury rates declined 55% from baseline measures in 2005 with early training and increased use of tools with sequential actuation. Injury rates declined among users of tools with both actuation systems, but the rates of injury were consistently twice as high among those using tools with contact trip triggers. DISCUSSION AND IMPACT: Nail gun injuries can be reduced markedly through early training and use of tools with sequential actuation. These successful efforts need to be diffused broadly, including to the non-union sector. (c) 2010 Elsevier Ltd. All rights reserved.
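The comparison in this record is an exposure-adjusted one: injuries per hour of tool use by trigger type, with the contact-trip rate about twice the sequential-actuation rate. A minimal sketch of that rate-ratio calculation; all counts and hours below are invented for illustration:

```python
# Hypothetical injury counts and exposure (tool-hours) by trigger type.
groups = {
    "sequential_actuation": {"injuries": 8,  "tool_hours": 40_000},
    "contact_trip":         {"injuries": 16, "tool_hours": 40_000},
}

# Injury rate per 10,000 tool-hours for each group.
rates = {name: g["injuries"] / g["tool_hours"] * 10_000
         for name, g in groups.items()}

# Rate ratio: how many times higher the contact-trip injury rate is.
rate_ratio = rates["contact_trip"] / rates["sequential_actuation"]
```

The study's predictive model used Poisson regression, which generalizes this by modelling log-rates with the log of exposure as an offset and covariates such as training.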

  9. An experimental study of flank wear in the end milling of AISI 316 stainless steel with coated carbide inserts

    NASA Astrophysics Data System (ADS)

    Odedeyi, P. B.; Abou-El-Hossein, K.; Liman, M.

    2017-05-01

    Stainless steel 316 is a difficult-to-machine iron-based alloy that contains a minimum of about 12% chromium and is commonly used in the marine and aerospace industries. This paper presents an experimental study of tool wear propagation in the end milling of stainless steel 316 with coated carbide inserts. The milling tests were conducted at three different cutting speeds, while feed rate and depth of cut were (0.02, 0.06 and 0.1) mm/rev and (1, 2 and 3) mm, respectively. The cutting tool used was a TiAlN PVD multi-layered coated carbide. The effects of cutting speed, cutting tool coating top layer and workpiece material on tool life were investigated. The results showed that cutting speed significantly affected the measured flank wear values. With increasing cutting speed, the flank wear values decreased. The experimental results showed that flank wear was the major and predominant failure mode affecting the tool life.

  10. Comparative analysis and visualization of multiple collinear genomes

    PubMed Central

    2012-01-01

    Background Genome browsers are a common tool used by biologists to visualize genomic features including genes, polymorphisms, and many others. However, existing genome browsers and visualization tools are not well-suited to perform meaningful comparative analysis among a large number of genomes. With the increasing quantity and availability of genomic data, there is an increased burden to provide useful visualization and analysis tools for comparison of multiple collinear genomes such as the large panels of model organisms which are the basis for much of the current genetic research. Results We have developed a novel web-based tool for visualizing and analyzing multiple collinear genomes. Our tool illustrates genome-sequence similarity through a mosaic of intervals representing local phylogeny, subspecific origin, and haplotype identity. Comparative analysis is facilitated through reordering and clustering of tracks, which can vary throughout the genome. In addition, we provide local phylogenetic trees as an alternate visualization to assess local variations. Conclusions Unlike previous genome browsers and viewers, ours allows for simultaneous and comparative analysis. Our browser provides intuitive selection and interactive navigation about features of interest. Dynamic visualizations adjust to scale and data content making analysis at variable resolutions and of multiple data sets more informative. We demonstrate our genome browser for an extensive set of genomic data sets composed of almost 200 distinct mouse laboratory strains. PMID:22536897

  11. Tools of the Trade

    ERIC Educational Resources Information Center

    Arnold, Kathy

    2012-01-01

    This article outlines the author's efforts to build her knowledge of students' understandings of mathematics whilst catering for different abilities within a Year 1 classroom, using the freely available "Assessment for Common Misunderstandings tools." "The Assessment for Common Misunderstandings" materials have been…

  12. The role of health behavior in preventing dental caries in resource-poor adults: a pilot intervention.

    PubMed

    Wu, Andrew; Switzer-Nadasdi, Rhonda

    2014-01-01

    Dental caries is a highly prevalent, yet preventable disease that is commonly overlooked in the adult population. It is strongly related to health-related behaviors and knowledge, and therefore, is potentially receptive to a behavioral health intervention. However, prevention strategies that target health behaviors in adults are fundamentally different from those in children, whom most current intervention strategies for dental caries target. This study pilots the design, implementation, and assessment of health behavior intervention tools for adults, in order to improve their oral health. The objective was to increase knowledge about dental caries by 80% and increase positive self-reported oral hygiene behaviors by 80% in low-income adult participants at Interfaith Dental Clinic by piloting novel interventional and educational tools based on the Transtheoretical Model of Health Behavior. A convenience sample of newly registered participants to the Interfaith Dental Clinic between August 2011 and May 2013, were interviewed on each participant's first appointment, exposed to the interventional tools, and subsequently interviewed at their next appointment. A control group, comprised of participants who had completed their caries care as deemed by the clinic and had not been exposed to the interventional tools, were also interviewed on their last appointment before graduating the clinic's program. A total of 112 participants were exposed to the intervention, and forty-two participants comprised the control group. Follow-up for the intervention group was 20.5% (n = 23). Knowledge about the cause of caries increased by 29.9%, and positive self-reported oral hygiene behaviors increased by 25.4%. A Wilcoxon rank sum test showed no significance between the interview scores of the post-intervention group and that of the control group (p = 0.18 for knowledge, p = 0.284 for behaviors). 
Qualitative results show the vast majority of participants blamed diet for cause of caries, that this participant population prioritized practical advice over factual education, and that flossing was perceived to be the largest barrier to proper oral care, citing pain, lack of time, and technique as common reasons. Educational tools based on current models of health behavior theory have the potential to improve participant knowledge and health behaviors, while also remaining low-cost and convenient for clinical use.

  13. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  14. Web 2.0 Technologies for Effective Knowledge Management in Organizations: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Nath, Anupam Kumar

    2012-01-01

    A new generation of Internet-based collaborative tools, commonly known as Web 2.0, has increased in popularity, availability, and power in the last few years (Kane and Fichman, 2009). Web 2.0 is a set of Internet-based applications that harness network effects by facilitating collaborative and participative computing (O'Reilly, 2006).…

  15. Comparing Formative and Summative Instruments: What Tools Inform Practice and Guide Teacher Candidate Decision Making?

    ERIC Educational Resources Information Center

    Wilcoxen, Christina L.

    2017-01-01

    With an increased focus on field-based preparation, the relationship between P12 school districts and universities has been forced to change with little or no support to create effective third space environments. The complexity of the student teaching experience is compounded by the need for redefined roles, the lack of a common lexicon and the…

  16. Life Satisfaction in Persons with Schizophrenia Living in the Community: Validation of the Satisfaction with Life Scale

    ERIC Educational Resources Information Center

    Wu, Chia-Huei; Wu, Chin-Yu

    2008-01-01

    Subjective well-being is an increasingly common indicator of adequacy of psychiatric services. An easy-to-administer assessment tool of subjective well-being that is conceptually sound, valid, and reliable is needed for use in persons with schizophrenia. The purpose of this paper was to validate the 5-item Satisfaction with Life Scale…

  17. Modelling and Bibliotherapy as Tools to Enhance Pro-Social Interactions during English Language Arts Lessons with First Graders

    ERIC Educational Resources Information Center

    Nguyen, Neal; Lyons, Catherine; Gelfer, Jeff; Leytham, Patrick; Nelson, Leslie; Krasch, Delilah; O'Hara, Katie

    2016-01-01

    Play is one of the essential components in proper development of first-grade students. Since the adoption by various states of the Common Core State Standards (CCSS), two outcomes have developed: (a) increased instructional time and (b) decreased public school recess periods across school districts. Given the complex nature of daily instructional…

  18. Mathematics from High School to Community College: Using Existing Tools to Increase College-Readiness Now. Policy Brief 14-1

    ERIC Educational Resources Information Center

    Jaffe, Louise

    2014-01-01

    The adoption and implementation of the Common Core State Standards and Smarter Balanced assessments in mathematics are intended to provide all students in California with the knowledge and skills required to transition from high school to college-level coursework. This implementation will take time. Concurrent with these efforts, policymakers and…

  19. Effectiveness of fishing gears to assess fish assemblage size structure in small lake ecosystems

    Treesearch

    T. A. Clement; K. Pangle; D. G. Uzarski; B. A. Murry

    2014-01-01

    Measurement of fish body-size distributions is increasingly used as a management tool to assess fishery status. However, the effects of gear selection on observed fish size structure have not received sufficient attention. Four different gear types (experimental gill nets, fine mesh bag seine, and two different sized mesh trap nets), which are commonly employed in the...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalavapudi, M.; Iyengar, V.

    Increased industrial activities in developing countries have degraded the environment, and the impact is further magnified by an ever-increasing population, the prime receptors. Independent of geographical location, it is possible to adopt effective strategies to solve environmental problems. In the United States, waste characterization and remediation practices are commonly used for quantifying toxic contaminants in air, water, and soil. Previously, such procedures were extraneous, ineffective, and cost-intensive. Reconciliation between the government and stakeholders, reinforced by valid data analysis and environmental exposure assessments, has allowed the "Brownfields" program to be a successful approach. Certified reference materials and standard reference materials from the National Institute of Standards and Technology (NIST) are indispensable tools for solving environmental problems and help to validate data quality and meet the demands of legal metrology. Certified reference materials are commonly available, essential tools for developing good-quality secondary and in-house reference materials that also enhance analytical quality. This paper cites examples of environmental conditions in developing countries, i.e., industrial pollution problems in India, polluted beaches in Brazil, and deteriorating air quality in countries such as Korea, China, and Japan. The paper also highlights practical and effective approaches for remediating these problems. 23 refs., 7 figs., 1 tab.

  1. Model Error Budgets

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2008-01-01

    An error budget is a commonly used tool in the design of complex aerospace systems. It represents system performance requirements in terms of allowable errors and flows these down through a hierarchical structure to lower assemblies and components. The requirements may simply be 'allocated' based upon heuristics or experience, or they may be designed through use of physics-based models. This paper presents a basis for developing an error budget for models of the system, as opposed to the system itself. The need for model error budgets arises when system models are a principal design agent, as is increasingly common for poorly testable high-performance space systems.
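The flow-down of allowable errors described above is commonly computed as a root-sum-square (RSS) of independent component allocations. A minimal sketch with hypothetical numbers, not taken from the paper:

```python
import math

def rss(errors):
    """Root-sum-square combination of independent error contributions."""
    return math.sqrt(sum(e * e for e in errors))

# Hypothetical flow-down: a 10-unit system-level allowable error
# allocated to three subsystems (names and values are illustrative).
allocations = {"structure": 4.0, "thermal": 5.0, "control": 6.0}
combined = rss(allocations.values())   # combined error of the allocations
margin = 10.0 - combined               # remaining margin against the requirement
```

Real error budgets add correlated terms and model-derived sensitivities on top of this; the RSS step is only the simplest building block.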

  2. Characterizing Marine Soundscapes.

    PubMed

    Erbe, Christine; McCauley, Robert; Gavrilov, Alexander

    2016-01-01

    The study of marine soundscapes is becoming widespread and the amount of data collected is increasing rapidly. Data owners (typically academia, industry, government, and defense) are negotiating data sharing and generating potential for data syntheses, comparative studies, analyses of trends, and large-scale and long-term acoustic ecology research. A problem is the lack of standards and commonly agreed protocols for the recording of marine soundscapes, data analysis, and reporting that make a synthesis and comparison of results difficult. We provide a brief overview of the components in a marine soundscape, the hard- and software tools for recording and analyzing marine soundscapes, and common reporting formats.

  3. Understanding the influence of social media in medicine: lesson learned from Facebook.

    PubMed

    Savas, Jessica A; Huang, Karen E; Tuchayi, Sara Moradi; Feldman, Steven R

    2014-09-16

    Atopic dermatitis is a very common chronic skin disease. With an increasing number of patients searching social media outlets such as Facebook for medical information, social media can be used by physicians as a powerful educational tool. We analyzed the unmoderated Q&A series on Facebook begun by members of the National Eczema Association Scientific Advisory Committee. Four respondents accounted for more than 50% of all responses, and the most common posts were negative posts about topical steroids (61%). Possible strategies to accomplish the safe dissemination of information in a public forum may include a moderator role for physicians.

  4. XML schemas for common bioinformatic data types and their application in workflow systems

    PubMed Central

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-01-01

    Background Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data – therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Results Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at http://bioschemas.sourceforge.net, and the BioDOM library can be obtained at http://biodom.sourceforge.net. Conclusion The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios. PMID:17087823
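A common-format record of the kind HOBIT defines can be produced with any XML tooling. A minimal Python sketch using the standard library; the element names here are illustrative only, not the actual HOBIT schema:

```python
import xml.etree.ElementTree as ET

def sequence_record(seq_id, residues):
    """Build a toy XML sequence record (illustrative element names,
    not the real HOBIT sequenceML schema)."""
    root = ET.Element("sequenceML")
    seq = ET.SubElement(root, "sequence", id=seq_id)
    ET.SubElement(seq, "residues").text = residues
    return ET.tostring(root, encoding="unicode")

xml_text = sequence_record("P12345", "MKTAYIAKQR")
```

Real interoperability comes from validating such records against the published schemas, which for the HOBIT formats is handled by the BioDOM library.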

  5. Keeping you at arm's length: modifying peripersonal space influences interpersonal distance.

    PubMed

    Quesque, F; Ruggiero, G; Mouta, S; Santos, J; Iachini, T; Coello, Y

    2017-07-01

    Peripersonal space represents the area around the body where objects are coded in motor terms for the purpose of voluntary goal-directed actions. Previous studies have suggested that peripersonal space is also a safe space linked with our private area, influencing interpersonal space in social contexts. However, whether these two spaces rely on similar embodied processes remains an open issue. In the present study, participants observed a point-light walker (PLW) approaching them from different directions and passing near them at different distances from their right or left shoulder. While approaching, the PLW disappeared at a distance of 2 m and the task for the participants was to estimate if the interpersonal distance, at the time the PLW would have reached their level, was comfortable or not. Between two sessions of comfort judgments, the participants manipulated a 70 cm tool entailing an extension of peripersonal space, or a 10 cm tool entailing no extension of peripersonal space. The results revealed that the comfortable interpersonal distance was larger when the PLW crossed the mid-sagittal plane of the participants than when it approached them laterally, with a concomitant increase of response time. After participants manipulated the long tool, comfortable interpersonal distance increased, but predominantly when the PLW trajectory implied crossing the participants' mid-sagittal plane. This effect was not observed when participants manipulated the short tool. Two control tasks showed that using the long tool modified the reachability (control 1), but not the time to passage (control 2) estimates of PLW stimuli, suggesting that tool use extended peripersonal space without changing perceived visual distances. Overall, the data show that comfortable interpersonal distance is linked to the representation of peripersonal space. 
As a consequence, increasing peripersonal space through tool use has the immediate consequence that comfortable interpersonal distance from another person also increases, suggesting that interpersonal-comfort space and peripersonal-reaching space share a common motor nature.

  6. GACT: a Genome build and Allele definition Conversion Tool for SNP imputation and meta-analysis in genetic association studies.

    PubMed

    Sulovari, Arvis; Li, Dawei

    2014-07-19

    Genome-wide association studies (GWAS) have successfully identified genes associated with complex human diseases. Although much of the heritability remains unexplained, combining single nucleotide polymorphism (SNP) genotypes from multiple studies for meta-analysis will increase the statistical power to identify new disease-associated variants. Meta-analysis requires same allele definition (nomenclature) and genome build among individual studies. Similarly, imputation, commonly-used prior to meta-analysis, requires the same consistency. However, the genotypes from various GWAS are generated using different genotyping platforms, arrays or SNP-calling approaches, resulting in use of different genome builds and allele definitions. Incorrect assumptions of identical allele definition among combined GWAS lead to a large portion of discarded genotypes or incorrect association findings. There is no published tool that predicts and converts among all major allele definitions. In this study, we have developed a tool, GACT, which stands for Genome build and Allele definition Conversion Tool, that predicts and inter-converts between any of the common SNP allele definitions and between the major genome builds. In addition, we assessed several factors that may affect imputation quality, and our results indicated that inclusion of singletons in the reference had detrimental effects while ambiguous SNPs had no measurable effect. Unexpectedly, exclusion of genotypes with missing rate > 0.001 (40% of study SNPs) showed no significant decrease of imputation quality (even significantly higher when compared to the imputation with singletons in the reference), especially for rare SNPs. 
GACT is a new, powerful, and user-friendly tool with both command-line and interactive online versions that can accurately predict and convert between any of the common allele definitions and between genome builds for genome-wide meta-analysis and imputation of genotypes from SNP arrays or deep sequencing, particularly for data from dbGaP and other public databases. http://www.uvm.edu/genomics/software/gact.
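The core of any allele-definition conversion is complementing alleles across strands and flagging strand-ambiguous SNPs (the ones the abstract notes have no measurable effect on imputation). A pure-Python sketch; the function names are illustrative, not GACT's actual API:

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def flip_strand(a1, a2):
    """Convert an allele pair to the opposite strand."""
    return COMPLEMENT[a1], COMPLEMENT[a2]

def is_ambiguous(a1, a2):
    """A/T and C/G SNPs read identically on both strands, so they
    cannot be resolved by complementing alone."""
    return {a1, a2} in ({"A", "T"}, {"C", "G"})

flipped = flip_strand("A", "G")   # ("T", "C")
```

A full converter like GACT layers genome-build liftover and definition prediction (e.g. from allele-frequency concordance) on top of this basic strand arithmetic.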

  7. Development of a questionnaire for assessing factors predicting blood donation among university students: a pilot study.

    PubMed

    Jalalian, Mehrdad; Latiff, Latiffah; Hassan, Syed Tajuddin Syed; Hanachi, Parichehr; Othman, Mohamed

    2010-05-01

    University students are a target group for blood donor programs. To develop a blood donation culture among university students, it is important to identify the factors that predict their intent to donate blood. This study attempted to develop a valid and reliable measurement tool for assessing variables in a blood donation behavior model based on the Theory of Planned Behavior (TPB), a commonly used theoretical foundation for social psychology studies. We employed an elicitation study, in which we determined the commonly held behavioral and normative beliefs about blood donation. We used the results of the elicitation study and a standard format for creating questionnaire items for all constructs of the TPB model to prepare the first draft of the measurement tool. After piloting the questionnaire, we prepared the final draft to be used in our main study. Examination of internal consistency using Cronbach's alpha coefficient and item-total statistics indicated that the constructs "Intention" and "Self-efficacy" had the highest reliability. Removing one item from each of the constructs "Attitude," "Subjective norm," "Self-efficacy," or "Behavioral beliefs" could considerably increase the reliability of the measurement tool; however, such action is controversial, especially for the variables "attitude" and "subjective norm." We retained all the items of our first draft questionnaire in our main study to make it a reliable measurement tool.

  8. Assessment of nutritional status in laparoscopic gastrectomy for gastric cancer.

    PubMed

    Son, Young-Gil; Kwon, In Gyu; Ryu, Seung Wan

    2017-01-01

    Malnutrition is very common in gastric cancer and can be detected in up to 85% of patients with the disease. Malnutrition is associated with increased morbidity and mortality, prolonged hospital stay, poor treatment tolerance, and lower survival rate; it also has an impact on quality of life. Early detection of nutritional risk, with appropriate nutritional care, can significantly reduce postoperative morbidity and mortality. Because there is no gold-standard tool, appropriate tools should be selected and applied depending on institutional conditions, and it is recommended that nutritional assessment be performed for every patient in both the pre- and postoperative periods.

  9. A web-based tool to predict acute kidney injury in patients with ST-elevation myocardial infarction: Development, internal validation and comparison.

    PubMed

    Zambetti, Benjamin R; Thomas, Fridtjof; Hwang, Inyong; Brown, Allen C; Chumpia, Mason; Ellis, Robert T; Naik, Darshan; Khouzam, Rami N; Ibebuogu, Uzoma N; Reed, Guy L

    2017-01-01

    In ST-elevation myocardial infarction (STEMI), acute kidney injury (AKI) may increase subsequent morbidity and mortality. Still, it remains difficult to predict AKI risk in these patients. We sought to 1) determine the frequency and clinical outcomes of AKI and, 2) develop, validate and compare a web-based tool for predicting AKI. In a racially diverse series of 1144 consecutive STEMI patients, Stage 1 or greater AKI occurred in 12.9% and was severe (Stage 2-3) in 2.9%. AKI was associated with increased mortality (5.7-fold, unadjusted) and hospital stay (2.5-fold). AKI was associated with systolic dysfunction, increased left ventricular end-diastolic pressures, hypotension and intra-aortic balloon counterpulsation. A computational algorithm (UT-AKI) was derived and internally validated. It showed higher sensitivity and improved overall prediction for AKI (area under the curve 0.76) vs. other published indices. Higher UT-AKI scores were associated with more severe AKI, longer hospital stay and greater hospital mortality. In a large, racially diverse cohort of STEMI patients, Stage 1 or greater AKI was relatively common and was associated with significant morbidity and mortality. A web-accessible, internally validated tool was developed with improved overall value for predicting AKI. By identifying patients at increased risk, this tool may help physicians tailor post-procedural diagnostic and therapeutic strategies after STEMI to reduce AKI and its associated morbidity and mortality.

  10. Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites

    NASA Technical Reports Server (NTRS)

    Culver, Michael R.; Soong, Christine; Warner, Joseph D.

    2014-01-01

    In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by using simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include access to a database of NASA ground-based apertures for near Earth and deep space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers and are approachable enough to allow non-communication experts to design preliminary communication payloads.
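At its simplest, a link budget of the kind the second tool computes reduces to adding gains and subtracting losses in decibels around the free-space path loss. A hedged sketch: the formulas are the standard Friis relations, but the scenario numbers are illustrative and not from the NASA tool:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

def link_margin_db(eirp_dbw, rx_gain_dbi, distance_m, freq_hz,
                   misc_losses_db, required_power_dbw):
    """Received power minus required power, in dB; positive means the link closes."""
    received = eirp_dbw + rx_gain_dbi - fspl_db(distance_m, freq_hz) - misc_losses_db
    return received - required_power_dbw

# Illustrative GEO downlink: 50 dBW EIRP, 50 dBi ground antenna, Ka-band
margin = link_margin_db(50.0, 50.0, 36_000e3, 26e9, 3.0, -130.0)
```

A production tool folds in antenna pointing losses, atmospheric attenuation, and noise temperature to reach Eb/N0; the dB bookkeeping above is the skeleton those terms hang on.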

  11. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This meta data system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins can integrate their results, e.g. post-processed data, into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database.
Configurations and results of the tools can be shared among scientists via the shell or web system; plugged-in tools therefore benefit from transparency and reproducibility. Furthermore, if configurations match when an evaluation plugin is started, the system suggests reusing results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.

  12. Breaking common ground: Scalar perceptions and the effects on energy extraction in Pavillion, Wyoming

    NASA Astrophysics Data System (ADS)

    Watts, Kaitlyn

    Conflicts over natural resources are increasing throughout the world. Researchers have taken the geographic concept of scale and applied it as a tool for analyzing environmental conflict and determining the correct jurisdictional arena for regulation. My research takes this social construction of scale and applies it to a case study of energy extraction in Pavillion, Wyoming. The case study focuses on the conflict that developed over hydraulic fracturing and water contamination at a time when the use of hydraulic fracturing increased nationwide. Through personal interviews and document analysis, I determine the ways that stakeholders use scale in the conflict to shape the strategies they use to influence policy decisions. This provides an example of how scale can be used as an effective tool for policy analysis and environmental conflict resolution.

  13. Introducing Students to Gas Chromatography-Mass Spectrometry Analysis and Determination of Kerosene Components in a Complex Mixture

    ERIC Educational Resources Information Center

    Pacot, Giselle Mae M.; Lee, Lyn May; Chin, Sung-Tong; Marriott, Philip J.

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS) and GC-tandem MS (GC-MS/MS) are useful in many separation and characterization procedures. GC-MS is now a common tool in industry and research, and increasingly, GC-MS/MS is applied to the measurement of trace components in complex mixtures. This report describes an upper-level undergraduate experiment…

  14. Practice tip. Producing newsletters - criteria for success.

    PubMed

    Eckermann, Sarah Louise; McIntyre, Ellen; Magarey, Anne

    2007-01-01

    Newsletters, while informal when compared to peer reviewed journals and organisational reports, are a popular communication tool used to provide and share information, for and about people with a common interest or need. Newsletters are an effective way to build networks and improve communication. As the editors of several newsletters in primary health care, the authors have developed criteria that aim to increase the success of a newsletter.

  15. Institutional and Technological Barriers to the Use of Open Educational Resources (OERs) in Physiology and Medical Education

    ERIC Educational Resources Information Center

    Hassall, Christopher; Lewis, David I.

    2017-01-01

    Open educational resources (OERs) are becoming increasingly common as a tool in education, particularly in medical and biomedical education. However, three key barriers have been identified to their use: 1) lack of awareness of OERs, 2) lack of motivation to use OERs, and 3) lack of training in the use of OERs. Here, we explore these three…

  16. Prevalence, pattern, and factors associated with work-related musculoskeletal disorders among pluckers in a tea plantation in Tamil Nadu, India

    PubMed Central

    Vasanth, Deepthi; Ramesh, Naveen; Fathima, Farah Naaz; Fernandez, Ria; Jennifer, Steffi; Joseph, Bobby

    2015-01-01

    Context: Musculoskeletal pain is common among tea leaf pluckers and is attributed to the load they carry, long working hours, the terrain, and insufficient job rotations. As a result, their health and work capacity are affected. Aims: To assess the prevalence, patterns, and factors associated with work-related musculoskeletal disorders (WRMDs) among pluckers in a tea plantation in Annamalai, Tamil Nadu, India. Settings and Design: This cross-sectional study surveyed 195 pluckers, selected by simple random sampling, aged between 18 years and 60 years. Materials and Methods: The interview schedule had four parts: sociodemographic details, the Standard Nordic Scale, a numeric and facial pain rating tool, and a tool to assess factors associated with WRMDs. Statistical Analysis Used: Statistical Package for the Social Sciences (SPSS) version 16. Results: Prevalence of musculoskeletal pain in the last 12 months and the last 7 days was 83.6% and 78.5%, respectively. The most common site for the last 12 months was the shoulder (59%) and for the last 7 days was the lower back (52.8%). An independent t-test revealed that the mean age of workers with pain was 6.59 years greater, and their mean duration of employment 1.38 years longer, than those of workers without pain. Increasing morbidities among workers were also significantly associated with an increase in WRMDs on the Chi-square test. Conclusions: The prevalence of musculoskeletal pain was high among tea pluckers; the most common sites during the last 12 months and the last 7 days were the shoulder and the lower back, respectively, and the pain was mild in character. Increases in age and duration of employment were associated with WRMDs. PMID:26957816

  17. Conversion of National Health Insurance Service-National Sample Cohort (NHIS-NSC) Database into Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM).

    PubMed

    You, Seng Chan; Lee, Seongwon; Cho, Soo-Yeon; Park, Hojun; Jung, Sungjae; Cho, Jaehyeong; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    It is increasingly necessary to generate medical evidence applicable to Asian people compared to those in Western countries. Observational Health Data Sciences and Informatics (OHDSI) is an international collaborative that aims to facilitate the generation of high-quality evidence by creating and applying open-source data analytic solutions to a large network of health databases across countries. We aimed to incorporate Korean nationwide cohort data into the OHDSI network by converting the national sample cohort into the Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM). The data of 1.13 million subjects were converted to OMOP-CDM, with an average conversion rate of 99.1%. ACHILLES, an open-source OMOP-CDM-based data profiling tool, was run on the converted database to visualize data-driven characterization and assess data quality. The OMOP-CDM version of the National Health Insurance Service-National Sample Cohort (NHIS-NSC) can be a valuable tool for multiple aspects of medical research through incorporation into the OHDSI research network.

  18. Tribological performances of new steel grades for hot stamping tools

    NASA Astrophysics Data System (ADS)

    Medea, F.; Venturato, G.; Ghiotti, A.; Bruschi, S.

    2017-09-01

    In recent years, the use of High Strength Steels (HSS) as structural parts in car body-in-white manufacturing has rapidly increased thanks to their favourable strength-to-weight ratio and stiffness, which allow a reduction of fuel consumption to meet the new restricted regulations for CO2 emissions control. A survey of the technical and scientific literature shows large interest in the development of different coatings for the blanks, from the traditional Al-Si up to new Zn-based coatings, and in the analysis of hard PVD and CVD coatings and plasma nitriding applied to the tools. By contrast, fewer investigations have focused on the development and testing of new tool steel grades capable of improving the wear resistance and thermal properties required for in-die quenching during forming. On this basis, the paper analyses and compares the tribological performance, in terms of wear, friction and heat transfer, of new tool steel grades for high-temperature applications, characterized by a higher thermal conductivity than commonly used tools. Testing equipment and procedures, as well as the measurement analyses used to evaluate the friction coefficient, wear and heat transfer phenomena, are presented. Emphasis is given to the physical simulation techniques that were specifically developed to reproduce the thermal and mechanical cycles on the metal sheets and dies as in industrial practice. The reference industrial process is the direct hot stamping of 22MnB5 HSS coated with the common Al-Si coating for automotive applications.

  19. XML schemas for common bioinformatic data types and their application in workflow systems.

    PubMed

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-11-06

    Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data--therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at http://bioschemas.sourceforge.net, the BioDOM library can be obtained at http://biodom.sourceforge.net. The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios.

  20. The use of Skype in a community hospital inpatient palliative medicine consultation service.

    PubMed

    Brecher, David B

    2013-01-01

    Skype™, an Internet-based communication tool, has enhanced communication under numerous circumstances. As telemedicine continues to be an increasing part of medical practice, there will be more opportunities to use Skype and similar tools. Numerous scenarios in the lay literature have helped to highlight the potential uses. Although most commonly used to enhance physician-to-patient communication, there has been limited reported use of Skype for patient-to-family communication, especially in end-of-life and palliative care. Our inpatient Palliative Medicine Consultation Service has offered and used this technology to enhance our patients' quality of life. The objective was to provide another tool for our patients to use to communicate with family and/or friends, especially under circumstances in which clinical symptoms, functional status, financial concerns, or geographic limitations preclude in-person, face-to-face communication.

  1. Learnings From the Pilot Implementation of Mobile Medical Milestones Application.

    PubMed

    Page, Cristen P; Reid, Alfred; Coe, Catherine L; Carlough, Martha; Rosenbaum, Daryl; Beste, Janalynn; Fagan, Blake; Steinbacher, Erika; Jones, Geoffrey; Newton, Warren P

    2016-10-01

    Implementation of the educational milestones benefits from mobile technology that facilitates ready assessments in the clinical environment. We developed a point-of-care resident evaluation tool, the Mobile Medical Milestones Application (M3App), and piloted it in 8 North Carolina family medicine residency programs. We sought to examine variations we found in the use of the tool across programs and explored the experiences of program directors, faculty, and residents to better understand the perceived benefits and challenges of implementing the new tool. Residents and faculty completed presurveys and postsurveys about the tool and the evaluation process in their program. Program directors were interviewed individually. Interviews and open-ended survey responses were analyzed and coded using the constant comparative method, and responses were tabulated under themes. Common perceptions included increased data collection, enhanced efficiency, and increased perceived quality of the information gathered with the M3App. Residents appreciated the timely, high-quality feedback they received. Faculty reported becoming more comfortable with the tool over time, and a more favorable evaluation of the tool was associated with higher utilization. Program directors reported improvements in faculty knowledge of the milestones and resident satisfaction with feedback. Faculty and residents credited the M3App with improving the quality and efficiency of resident feedback. Residents appreciated the frequency, proximity, and specificity of feedback, and faculty reported the app improved their familiarity with the milestones. Implementation challenges included lack of a physician champion and competing demands on faculty time.

  2. Integrated hydraulic booster/tool string technology for unfreezing of stuck downhole strings in horizontal wells

    NASA Astrophysics Data System (ADS)

    Tian, Q. Z.

    2017-12-01

    It is common to use a jarring tool to unfreeze a stuck downhole string. In a horizontal well, however, the jarring effect is poor because of the friction generated in the deviated section; on the other hand, a hydraulic booster can place the forcing point in the horizontal section and reduce the friction, but applying a large, constant pull force is time-consuming and can easily break the downhole string. A hydraulic booster-jar tool string has therefore been developed for unfreezing operations in horizontal wells. The technical solution involves three elements: a two-stage parallel spring cylinder structure that increases the energy storage capacity of the spring accelerators; multiple groups of spring accelerators connected in series to increase the working stroke; and a hydraulic booster that intensifies the jarring force. The integrated unfreezing tool string based on these three elements can effectively overcome the friction caused by a deviated borehole and thus free a stuck string through the interaction of the hydraulic booster and the mechanical jar, which together form an alternating dynamic load. Experimental results show that the jarring performance parameters of the hydraulic booster-jar unfreezing tool string for horizontal wells are in accordance with the original design requirements. Field technical parameters were then developed based on numerical simulation and experimental data. Field application shows that the hydraulic booster-jar unfreezing tool string is effective in freeing stuck downhole tools in a horizontal well, reduces hook load by 80% and lessens the requirement for workover equipment. This provides a new technology for unfreezing stuck downhole strings in horizontal wells.
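    The design logic of the first two elements can be illustrated with elementary spring mechanics: springs in parallel add their stiffnesses (raising the energy stored at a given stroke), while identical stages in series add their strokes at the same force. The numbers below are illustrative, not the tool's actual specifications.

```python
def parallel_stiffness(ks):
    # springs side by side share the load: stiffnesses add
    return sum(ks)

def series_stroke(stroke_each, n):
    # identical stages in series: strokes add at the same force
    return stroke_each * n

def stored_energy(k, x):
    # elastic energy of a linear spring: E = 1/2 * k * x^2
    return 0.5 * k * x**2

k = parallel_stiffness([50e3, 50e3])   # two-stage parallel cylinder, N/m
x = series_stroke(0.05, 3)             # three accelerator groups in series, m
E = stored_energy(k, x)                # energy released on each jar stroke, J
```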

  3. Targeted proteins for diabetes drug design

    NASA Astrophysics Data System (ADS)

    Doan Trang Nguyen, Ngoc; Thi Le, Ly

    2012-03-01

    Type 2 diabetes mellitus is a common metabolic disorder characterized by high glucose levels in the bloodstream, especially in cases of insulin resistance and relative insulin deficiency. Nowadays it is very common in middle-aged people and carries such serious complications as an increased risk of stroke, obesity and heart failure. In Vietnam, besides the common treatment of insulin injection, some herbal medication is used, but no unified optimum remedy for the disease yet exists and there is as yet no domestic production of antidiabetic drugs. With the present development of nanomedicine, drug design is considered an innovative tool for researchers to study the mechanisms of diseases at the molecular level. The aim of this article is to review some common protein targets involved in type 2 diabetes, offering new ideas for designing drug candidates against type 2 diabetes for Vietnamese people.

  4. Electronic implementation of national nursing standards--NANDA, NOC and NIC as an effective teaching tool.

    PubMed

    Allred, Sharon K; Smith, Kevin F; Flowers, Laura

    2004-01-01

    With the increased interest in evidence-based medicine, Internet access and the growing emphasis on national standards, there is an increased challenge for teaching institutions and nursing services to teach and implement standards. At the same time, electronic clinical documentation tools have started to become a common format for recording nursing notes. The major aim of this paper is to ascertain and assess the availability of clinical nursing tools based on the NANDA, NOC and NIC standards. Faculty at 20 large nursing schools and directors of nursing at 20 hospitals were interviewed regarding the use of nursing standards in clinical documentation packages, not only for teaching purposes but also for use in hospital-based systems to ensure patient safety. A survey tool was utilized that covered questions regarding what nursing standards are being taught in the nursing schools, what standards are encouraged by the hospitals, and teaching initiatives that include clinical documentation tools. Information was collected on how utilizing these standards in a clinical or hospital setting can improve the overall quality of care. Analysis included univariate and bivariate analysis. The consensus between both groups was that the NANDA, NOC and NIC national standards are the most widely taught and utilized. In addition, a training initiative was identified within a large university where a clinical documentation system based on these standards was developed utilizing handheld devices.

  5. Innovations in health information technologies for chronic pulmonary diseases.

    PubMed

    Himes, Blanca E; Weitzman, Elissa R

    2016-04-05

    Asthma and chronic obstructive pulmonary disease (COPD) are common chronic obstructive lung disorders in the US that affect over 49 million people. There is no cure for asthma or COPD, but clinical guidelines exist for controlling symptoms that are successful in most patients who adhere to their treatment plan. Health information technologies (HITs) are revolutionizing healthcare by becoming mainstream tools to assist patients in self-monitoring and decision-making and, subsequently, driving a shift toward a care model increasingly centered on personal adoption and use of digital and web-based tools. While the number of chronic pulmonary disease HITs is rapidly increasing, most have not been validated as clinically effective tools for the management of disease. Online communities for asthma and COPD patients are becoming sources of empowerment and support, as well as facilitators of patient-centered research efforts. In addition to empowering patients and facilitating disease self-management, HITs offer promise to aid researchers in identifying chronic pulmonary disease endotypes and personalized treatments based on patient-specific profiles that integrate symptom occurrence and medication usage with environmental and genomic data.

  6. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in Python (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation on http://lenstools.readthedocs.io.
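    As an illustration of the kind of WL feature measurement such a pipeline automates, the sketch below bins the 2-D FFT power of a mock convergence map into an azimuthally averaged power spectrum using only NumPy. This is a generic sketch of the technique, not the LensTools API.

```python
import numpy as np

def azimuthal_power_spectrum(kappa, nbins=8):
    """Angular power spectrum of a square convergence map via FFT,
    averaged in bins of |ell| (generic sketch, not the LensTools API)."""
    n = kappa.shape[0]
    ft = np.fft.fft2(kappa)
    power2d = np.abs(ft)**2 / n**4          # normalized 2-D power
    freqs = np.fft.fftfreq(n) * n           # integer wavenumbers
    lx, ly = np.meshgrid(freqs, freqs)
    ell = np.hypot(lx, ly)                  # radial multipole of each mode
    bins = np.linspace(1, ell.max(), nbins + 1)
    idx = np.digitize(ell.ravel(), bins)
    spectrum = np.array([power2d.ravel()[idx == i].mean()
                         for i in range(1, nbins + 1)])
    centers = 0.5 * (bins[1:] + bins[:-1])
    return centers, spectrum

rng = np.random.default_rng(0)
kappa = rng.standard_normal((64, 64))       # mock convergence map
ell_centers, p_ell = azimuthal_power_spectrum(kappa)
```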

  7. Decision support frameworks and tools for conservation

    USGS Publications Warehouse

    Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.

    2018-01-01

    The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.

  8. Systematic Sustainability Assessment (SSA) Tool for Hydroelectric Project in Malaysia

    NASA Astrophysics Data System (ADS)

    Turan, Faiz Mohd; Johan, Kartina

    2017-08-01

    Sustainably developed and managed hydropower has enormous potential to contribute to global sustainability goals, as hydroelectricity contributes only small amounts of greenhouse gas emissions and other atmospheric pollutants. However, developing the remaining hydroelectric potential offers many challenges, and public pressure and expectations concerning the environmental and social performance of hydroelectric projects tend to increase over time. This paper aims to develop a Systematic Sustainability Assessment (SSA) tool that promotes and guides more sustainable hydroelectric projects in the context of Malaysia. The proposed SSA tool not only provides a qualitative and quantitative report of sustainability performance but also acts as a Self-Assessment Report (SAR), providing a roadmap towards a greater level of sustainability in project management for continuous improvement. It is expected to provide a common language that allows government, civil society, financial institutions and the hydroelectric sector to discuss and evaluate sustainability issues. The advantage of the SSA tool is that it can be used at any stage of hydroelectric development, from the earliest planning stages right through to operation.

  9. Simulation techniques in hyperthermia treatment planning

    PubMed Central

    Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC

    2013-01-01

    Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
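    The thermal side of HTP is commonly modelled with the Pennes bioheat equation. The sketch below advances a 1-D explicit finite-difference version of it; the tissue parameters are rough, literature-style values and the power deposition term is illustrative, so this is a toy model, not a clinical solver.

```python
import numpy as np

def pennes_step(T, Q, dx, dt, k=0.5, rho_c=3.6e6, perf=2000.0, T_art=37.0):
    """One explicit step of the 1-D Pennes bioheat equation:
    rho*c dT/dt = k d2T/dx2 - perf*(T - T_art) + Q.
    k [W/m/K], rho_c [J/m^3/K], perf [W/m^3/K] are rough tissue values."""
    lap = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2       # conduction
    dT = (k*lap - perf*(T[1:-1] - T_art) + Q[1:-1]) * dt / rho_c
    Tn = T.copy()
    Tn[1:-1] += dT                                   # boundaries held at 37 C
    return Tn

n = 101
T = np.full(n, 37.0)          # start at body temperature, deg C
Q = np.zeros(n)
Q[45:56] = 5e4                # W/m^3: illustrative focused power deposition
for _ in range(600):          # 600 s of heating, dx = 1 mm, dt = 1 s
    T = pennes_step(T, Q, dx=1e-3, dt=1.0)
```

    The explicit step is stable here because k*dt/(rho_c*dx^2) ≈ 0.14 < 0.5; a production planner would use an implicit solver on a 3-D patient model.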

  10. Using a formal requirements management tool for system engineering: first results at ESO

    NASA Astrophysics Data System (ADS)

    Zamparelli, Michele

    2006-06-01

    The attention to proper requirement analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering efforts and the usage of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility to do impact analysis. An industrially proven tool for requirements management is presented together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it appear a promising solution for even small-scale system development.

  11. Assessing Bleeding Risk in Patients Taking Anticoagulants

    PubMed Central

    Shoeb, Marwa; Fang, Margaret C.

    2013-01-01

    Anticoagulant medications are commonly used for the prevention and treatment of thromboembolism. Although highly effective, they are also associated with significant bleeding risks. Numerous individual clinical factors have been linked to an increased risk of hemorrhage, including older age, anemia, and renal disease. To help quantify hemorrhage risk for individual patients, a number of clinical risk prediction tools have been developed. These risk prediction tools differ in how they were derived and how they identify and weight individual risk factors. At present, their ability to effectively predict anticoagulant-associated hemorrhage remains modest. Use of risk prediction tools to estimate bleeding in clinical practice is most influential when applied to patients at the lower spectrum of thromboembolic risk, when the risk of hemorrhage will more strongly affect clinical decisions about anticoagulation. Using risk tools may also help counsel and inform patients about their potential risk for hemorrhage while on anticoagulants, and can identify patients who might benefit from more careful management of anticoagulation. PMID:23479259
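    Point-based prediction tools of this kind typically sum weighted risk factors and map the total to a risk category. The sketch below is a generic illustration in that spirit; the factors, one-point weights and thresholds are invented for the example and do not correspond to any validated instrument.

```python
# Generic point-based bleeding-risk score (illustrative only;
# not HAS-BLED or any other validated tool).
RISK_FACTORS = ("older_age", "anemia", "renal_disease",
                "prior_bleed", "antiplatelet_use")

def bleeding_risk_score(patient):
    """One point per risk factor present in the patient record."""
    return sum(1 for f in RISK_FACTORS if patient.get(f, False))

def risk_category(score):
    # illustrative thresholds only
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"

pt = {"older_age": True, "anemia": True, "renal_disease": False}
score = bleeding_risk_score(pt)
```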

  12. Role of solute-transport models in the analysis of groundwater salinity problems in agricultural areas

    USGS Publications Warehouse

    Konikow, Leonard F.

    1981-01-01

    Undesirable salinity increases occur in both groundwater and surface water and are commonly related to agricultural practices. Groundwater recharge from precipitation or irrigation will transport and disperse residual salts concentrated by evapotranspiration, salts leached from soil and aquifer materials, as well as some dissolved fertilizers and pesticides. Where stream salinity is affected by agricultural practices, the increases in salt load usually are attributable mostly to a groundwater component of flow. Thus, efforts to predict, manage, or control stream salinity increases should consider the role of groundwater in salt transport. Two examples of groundwater salinity problems in Colorado, U.S.A., illustrate that a model which accurately simulates the transport and dispersion of solutes in flowing groundwater can be (1) a valuable investigative tool to help understand the processes and parameters controlling the movement and fate of the salt, and (2) a valuable management tool for predicting responses and optimizing the development and use of the total water resource. © 1981.
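    Solute-transport models of this kind solve the advection-dispersion equation. A minimal 1-D explicit sketch (upwind advection, central dispersion) is shown below with illustrative parameters; it demonstrates the technique only and is not the USGS model discussed in the paper.

```python
import numpy as np

def advect_disperse(c0, v, D, dx, dt, steps):
    """Explicit finite-difference sketch of 1-D solute transport,
    dc/dt = D d2c/dx2 - v dc/dx, with upwind advection (v > 0)."""
    c = c0.astype(float).copy()
    for _ in range(steps):
        adv = -v * (c[1:-1] - c[:-2]) / dx                 # upwind advection
        disp = D * (c[2:] - 2*c[1:-1] + c[:-2]) / dx**2    # dispersion
        c[1:-1] += dt * (adv + disp)                       # boundaries fixed
    return c

c0 = np.zeros(50)
c0[0] = 1.0      # fixed-concentration boundary: a continuous salt source
# Courant number v*dt/dx = 0.2 keeps the explicit scheme stable.
c = advect_disperse(c0, v=1.0, D=0.1, dx=1.0, dt=0.2, steps=100)
```

    After 100 steps (t = 20), the salinity front has advected roughly 20 cells downstream, with dispersion smearing its edge.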

  13. Redesigning a Large-Enrollment Introductory Biology Course

    PubMed Central

    Ueckert, Catherine; Adams, Alison; Lock, Judith

    2011-01-01

    Using an action research model, biology faculty examined, implemented, and evaluated learner-centered instructional strategies to reach the goal of increasing the level of student achievement in the introductory biology course BIO 181: Unity of Life I, which was characterized by both high enrollments and a high DFW rate. Outcomes included the creation and implementation of an assessment tool for biology content knowledge and attitudes, development and implementation of a common syllabus, modification of the course to include learner-centered instructional strategies, and the collection and analysis of data to evaluate the success of the modifications. The redesigned course resulted in greater student success, as measured by grades (reduced %DFW and increased %AB) as well as by achievement in the course assessment tool. In addition, the redesigned course led to increased student satisfaction and greater consistency among different sections. These findings have important implications for both students and institutions, as the significantly lower DFW rate means that fewer students have to retake the course. PMID:21633065

  14. Electronic health record interventions at the point of care improve documentation of care processes and decrease orders for genetic tests commonly ordered by nongeneticists.

    PubMed

    Scheuner, Maren T; Peredo, Jane; Tangney, Kelly; Schoeff, Diane; Sale, Taylor; Lubick-Goldzweig, Caroline; Hamilton, Alison; Hilborne, Lee; Lee, Martin; Mittman, Brian; Yano, Elizabeth M; Lubin, Ira M

    2017-01-01

    To determine whether electronic health record (EHR) tools improve documentation of pre- and postanalytic care processes for genetic tests ordered by nongeneticists. We conducted a nonrandomized, controlled, pre-/postintervention study of EHR point-of-care tools (informational messages and template report) for three genetic tests. Chart review assessed documentation of genetic testing processes of care, with points assigned for each documented item. Multiple linear and logistic regressions assessed factors associated with documentation. Preimplementation, there were no significant site differences (P > 0.05). Postimplementation, mean documentation scores increased (5.9 (2.1) vs. 5.0 (2.2); P = 0.0001) and records with clinically meaningful documentation increased (score >5: 59 vs. 47%; P = 0.02) at the intervention versus the control site. Pre- and postimplementation, a score >5 was positively associated with abnormal test results (OR = 4.0; 95% CI: 1.8-9.2) and trainee provider (OR = 2.3; 95% CI: 1.2-4.6). Postimplementation, a score >5 was also positively associated with intervention site (OR = 2.3; 95% CI: 1.1-5.1) and specialty clinic (OR = 2.0; 95% CI: 1.1-3.6). There were also significantly fewer tests ordered after implementation (264/100,000 vs. 204/100,000; P = 0.03), with no significant change at the control site (280/100,000 vs. 257/100,000; P = 0.50). EHR point-of-care tools improved documentation of genetic testing processes and decreased utilization of genetic tests commonly ordered by nongeneticists. Genet Med 19(1):112-120.
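    The odds ratios reported above are the standard association measure from a 2x2 table. The sketch below computes an OR with a Wald 95% confidence interval; the counts are made up for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table:
    a, b = outcome present/absent in the exposed group;
    c, d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# illustrative counts only
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
```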

  15. Perceived Utility of Pharmacy Licensure Examination Preparation Tools

    PubMed Central

    Peak, Amy Sutton; Sheehan, Amy Heck; Arnett, Stephanie

    2006-01-01

    Objectives To identify board examination preparation tools most commonly used by recent pharmacy graduates and determine which tools are perceived as most valuable and representative of the actual content of licensure examinations. Methods An electronic survey was sent to all 2004 graduates of colleges of pharmacy in Indiana. Participants identified which specific preparation tools were used and rated tools based on usefulness, representativeness of licensure examination, and monetary value, and provided overall recommendations to future graduates. Results The most commonly used preparation tools were the Pharmacy Law Review Session offered by Dr. Thomas Wilson at Purdue University, the Complete Review for Pharmacy, Pre-NAPLEX, PharmPrep, and the Kaplan NAPLEX Review. Tools receiving high ratings in all categories included Dr. Wilson's Pharmacy Law Review Session, Pre-NAPLEX, Comprehensive Pharmacy Review, Kaplan NAPLEX Review, and Review of Pharmacy. Conclusions Although no preparation tool was associated with a higher examination pass rate, certain tools were clearly rated higher than others by test takers. PMID:17149406

  16. Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.

    PubMed

    Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda

    2015-08-31

    The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant the need for applying low-cost systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach: using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding, and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures.
From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate mappings of concepts. We used automated methods to detect almost half of 383,572 MetaMap's mappings as problematic. Word sense ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 score of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detect those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies to process patient-generated text.
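    One of the failure types, a boundary failure, lends itself to the kind of low-cost dictionary-based check the authors describe: flag a mapping whose span is a proper substring of a longer dictionary term that actually appears in the text. The dictionary and spans below are toy examples, not MetaMap output or the study's actual method.

```python
# Toy dictionary of multiword domain terms (illustrative only).
DICTIONARY = {"breast cancer", "radiation therapy", "side effect"}

def boundary_failures(text, mapped_spans):
    """Flag mapped spans that split a longer dictionary term
    present in the text (a sketch of boundary-failure detection)."""
    failures = []
    lowered = text.lower()
    for start, end in mapped_spans:
        span = lowered[start:end]
        for term in DICTIONARY:
            if span in term and span != term and term in lowered:
                failures.append((span, term))
                break
    return failures

text = "Her breast cancer required radiation therapy."
# Pretend the NLP tool mapped only "cancer" and "radiation":
spans = [(11, 17), (27, 36)]
fails = boundary_failures(text, spans)
```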

  17. The collaboratory for MS3D: a new cyberinfrastructure for the structural elucidation of biological macromolecules and their assemblies using mass spectrometry-based approaches.

    PubMed

    Yu, Eizadora T; Hawkins, Arie; Kuntz, Irwin D; Rahn, Larry A; Rothfuss, Andrew; Sale, Kenneth; Young, Malin M; Yang, Christine L; Pancerella, Carmen M; Fabris, Daniele

    2008-11-01

    Modern biomedical research is evolving with the rapid growth of diverse data types, biophysical characterization methods, computational tools and extensive collaboration among researchers spanning various communities and having complementary backgrounds and expertise. Collaborating researchers are increasingly dependent on shared data and tools made available by other investigators with common interests, thus forming communities that transcend the traditional boundaries of the single research laboratory or institution. Barriers, however, remain to the formation of these virtual communities, usually due to the steep learning curve associated with becoming familiar with new tools, or to the difficulties associated with transferring data between tools. Recognizing the need for shared reference data and analysis tools, we are developing an integrated knowledge environment that supports productive interactions among researchers. Here we report on our current collaborative environment, which focuses on bringing together structural biologists working in the area of mass spectrometry-based methods for the analysis of tertiary and quaternary macromolecular structures (MS3D), called the Collaboratory for MS3D (C-MS3D). C-MS3D is a Web portal designed to provide collaborators with a shared work environment that integrates data storage and management with data analysis tools. Files are stored and archived along with pertinent metadata in such a way as to allow file handling to be tracked (data provenance) and data files to be searched using keywords and modification dates. While at this time the portal is designed around a specific application, the shared work environment is a general approach to building collaborative work groups. The goal is not only to provide a common data-sharing and archiving system but also to assist in the building of new collaborations and to spur the development of new tools and technologies.

  18. Does the Market Value Value-Added? Evidence from Housing Prices after a Public Release of School and Teacher Value-Added. Working Paper #47

    ERIC Educational Resources Information Center

    Imberman, Scott; Lovenheim, Michael F.

    2015-01-01

    Value-added data have become an increasingly common evaluation tool for schools and teachers. Many school districts have begun to adopt these methods and have released results publicly. In this paper, we use the unique public release of value-added data in Los Angeles to identify how this measure of school quality is capitalized into housing…

  19. PubMed Central

    Worrall, Graham; Chambers, Larry W.

    1990-01-01

    With the increasing expenditure on health care programs for seniors, there is an urgent need to evaluate such programs. The Measurement Iterative Loop is a tool that can provide both health administrators and health researchers with a method of evaluation of existing programs and identification of gaps in knowledge, and forms a rational basis for health-care policy decisions. In this article, the Loop is applied to one common problem of the elderly: dementia. PMID:21233998

  20. THE USE OF THE ELEVATED PLUS MAZE IN THE TOXICOLOGY LABORATORY: PILOT STUDIES AND ASSESSMENT OF ANXIETY IN RATS EXPOSED TO LEAD ACETATE OR SUB-CHRONIC LEVELS OF TOLUENE.

    EPA Science Inventory

    A common complaint of individuals exposed to neurotoxic agents is increased anxiety. Rat models of the effects of long-term exposure to environmental chemicals on anxiety are lacking. The elevated plus-maze (EPM) is a widely used tool in the search for new…

  1. The Effect of Fertilization on Sap flux and Canopy Conductance in a Eucalyptus saligna Experimental Forest

    Treesearch

    R. Hubbard; M. Ryan; C. Giardina; H. Barnard

    2004-01-01

    Land devoted to plantation forestry (50 million ha) has been increasing worldwide, and the genus Eucalyptus is a popular plantation choice (14 million ha) for its rapid growth and ability to grow well on a wide range of sites. Fertilization is a common silvicultural tool to improve tree growth, with potential effects on stand water use, but the relationship between wood growth...

  2. Molecular dynamics modeling of bonding two materials by atomic scale friction stir welding at different process parameters

    NASA Astrophysics Data System (ADS)

    Konovalenko S., Iv.; Psakhie, S. G.

    2017-12-01

    Using the molecular dynamics method, we simulated atomic-scale butt friction stir welding of two crystallites while varying the initial plunge depth of the FSW tool. The effects of the plunge depth on the thermomechanical evolution of the nanosized crystallites and on mass transfer in the course of FSW have been studied. Increasing the plunge depth resulted in more intense heating and reduced resistance of the plasticized metal to the tool movement. The mass transfer intensity was hardly dependent on the plunge depth. The plunge depth is therefore recommended as an FSW process control parameter in addition to the commonly used ones.

  3. ConoDictor: a tool for prediction of conopeptide superfamilies.

    PubMed

    Koua, Dominique; Brauer, Age; Laht, Silja; Kaplinski, Lauris; Favreau, Philippe; Remm, Maido; Lisacek, Frédérique; Stöcklin, Reto

    2012-07-01

    ConoDictor is a tool that enables fast and accurate classification of conopeptides into superfamilies based on their amino acid sequence. ConoDictor combines predictions from two complementary approaches: profile hidden Markov models and generalized profiles. Results appear in a browser as tables that can be downloaded in various formats. This application is particularly valuable in view of the exponentially increasing number of conopeptides that are being identified. ConoDictor was written in Perl using the Common Gateway Interface module with a PHP submission page. Sequence matching is performed with hmmsearch from HMMER 3 and ps_scan.pl from the pftools 2.3 package. ConoDictor is freely accessible at http://conco.ebc.ee.
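
    The superfamily call in a pipeline like this comes from reconciling two independent classifiers. As a rough illustration (not ConoDictor's actual algorithm, which runs hmmsearch and ps_scan.pl with its own scoring), a consensus step over per-sequence predictions might look like:

```python
def consensus_call(hmm_hits, profile_hits):
    """Merge per-sequence superfamily predictions from two methods.

    Each argument maps sequence ID -> (superfamily, score); scores are
    assumed comparable for this sketch. Returns sequence ID -> call,
    with single-method or conflicting calls flagged by a trailing '?'.
    """
    calls = {}
    for seq_id in set(hmm_hits) | set(profile_hits):
        h = hmm_hits.get(seq_id)
        p = profile_hits.get(seq_id)
        if h and p:
            if h[0] == p[0]:
                calls[seq_id] = h[0]  # both methods agree
            else:
                # disagreement: keep the higher-scoring call, flagged
                calls[seq_id] = max(h, p, key=lambda hit: hit[1])[0] + "?"
        else:
            calls[seq_id] = (h or p)[0] + "?"  # single-method evidence
    return calls
```

    Calls supported by both methods are reported plainly; single-method or conflicting calls carry a "?" so they can be routed to manual review.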

  4. Optimising the use of observational electronic health record data: Current issues, evolving opportunities, strategies and scope for collaboration.

    PubMed

    Liaw, Siaw-Teng; Powell-Davies, Gawaine; Pearce, Christopher; Britt, Helena; McGlynn, Lisa; Harris, Mark F

    2016-03-01

    With increasing computerisation in general practice, national primary care networks are mooted as sources of data for health services and population health research and planning. Existing data collection programs - MedicinesInsight, Improvement Foundation, Bettering the Evaluation and Care of Health (BEACH) - vary in purpose, governance, methodologies and tools. General practitioners (GPs) have significant roles as collectors, managers and users of electronic health record (EHR) data. They need to understand the challenges to their clinical and managerial roles and responsibilities. The aim of this article is to examine the primary and secondary use of EHR data, identify challenges, discuss solutions and explore directions. Representatives from existing programs, Medicare Locals, Local Health Districts and research networks held workshops on the scope, challenges and approaches to the quality and use of EHR data. Challenges included data quality, interoperability, fragmented governance, proprietary software, transparency, sustainability, competing ethical and privacy perspectives, and cognitive load on patients and clinicians. Proposed solutions included effective change management; transparent governance and management of intellectual property, data quality, security, ethical access, and privacy; common data models, metadata and tools; and patient/community engagement. Collaboration and common approaches to tools, platforms and governance are needed. Processes and structures must be transparent and acceptable to GPs.

  5. Use of a trigger tool to detect adverse drug reactions in an emergency department.

    PubMed

    de Almeida, Silvana Maria; Romualdo, Aruana; de Abreu Ferraresi, Andressa; Zelezoglo, Giovana Roberta; Marra, Alexandre R; Edmond, Michael B

    2017-11-15

    Although there are systems for reporting adverse drug reactions (ADRs), these safety events remain underreported. The low-cost, low-tech trigger tool method is based on the detection of events through clues, and it appears to increase the detection of adverse events compared to traditional methodologies. This study seeks to estimate the prevalence of adverse drug reactions in patients seeking care in the emergency department. Retrospective study from January to December, 2014, applying the Institute for Healthcare Improvement (IHI) trigger tool methodology for patients treated at the emergency room of a tertiary care hospital. The estimated prevalence of adverse reactions in patients presenting to the emergency department was 2.3% [95% CI 1.3% to 3.3%]; 28.6% of cases required hospitalization at an average cost of US$ 5698.44. The most common triggers were hydrocortisone (57% of cases), diphenhydramine (14%) and fexofenadine (14%). Anti-infectives (19%), cardiovascular agents (14%), and musculoskeletal drugs (14%) were the most common causes of adverse reactions. According to the Naranjo Scale, 71% were classified as possible and 29% as probable. There was no association between adverse reactions and age or sex. Using the trigger tool to identify adverse reactions in the emergency department, we found a prevalence of 2.3%; the method proved viable and can provide a better understanding of adverse drug reactions in this patient population.
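
    The reported interval is a standard confidence interval for a proportion. The abstract does not give the denominator, so the counts below are hypothetical, chosen only to illustrate the arithmetic behind "2.3% [95% CI 1.3% to 3.3%]" using a Wald interval:

```python
import math

def prevalence_ci(events, n, z=1.96):
    """Point estimate and Wald 95% confidence interval for a prevalence."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    return p, p - z * se, p + z * se

# Hypothetical counts (the abstract does not report the denominator):
# 20 ADR cases among 870 visits lands near the reported 2.3% [1.3%, 3.3%].
p, lo, hi = prevalence_ci(20, 870)
```

    For small event counts an exact or Wilson interval would be preferable; the Wald form is shown only because it makes the arithmetic transparent.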

  6. What if Finding Data was as Easy as Subscribing to the News?

    NASA Astrophysics Data System (ADS)

    Duerr, R. E.

    2011-12-01

    Data are the "common wealth of humanity," the fuel that drives the sciences; but much of the data that exist are inaccessible, buried in one of numerous stove-piped data systems, or entirely hidden unless you have direct knowledge of and contact with the investigator that acquired them. Much of the "wealth" is squandered and overall scientific progress inhibited, a situation that is becoming increasingly untenable with the openness required by data-driven science. What are needed are simple interoperability protocols and advertising mechanisms that allow data from disparate data systems to be easily discovered, explored, and accessed. The tools must be simple enough that individual investigators can use them without IT support. The tools cannot rely on centralized repositories or registries but must enable the development of ad-hoc or special purpose aggregations of data and services tailored to individual community needs. In addition, the protocols must scale to support the discovery of and access to the holdings of the global, interdisciplinary community, be they individual investigators or major data centers. NSIDC, in conjunction with other members of the Federation of Earth Science Information Partners and the Polar Information Commons, are working on just such a suite of tools and protocols. In this talk, I discuss data and service casting, aggregation, data badging, and OpenSearch - a suite of tools and protocols which, when used in conjunction with each other, have the potential of completely changing the way that data and services worldwide are discovered and used.

  7. MRI in the assessment and monitoring of multiple sclerosis: an update on best practice

    PubMed Central

    Kaunzner, Ulrike W.; Gauthier, Susan A.

    2017-01-01

    Magnetic resonance imaging (MRI) has developed into the most important tool for the diagnosis and monitoring of multiple sclerosis (MS). Its high sensitivity for the evaluation of inflammatory and neurodegenerative processes in the brain and spinal cord has made it the most commonly used technique for the evaluation of patients with MS. Moreover, MRI has become a powerful tool for treatment monitoring and safety assessment, as well as for the prognostication of disease progression. Clinically, the use of MRI has increased in the past couple of decades as a result of improved technology and increased availability that now extends well beyond academic centers. Consequently, there are numerous studies supporting the role of MRI in the management of patients with MS. The aim of this review is to summarize the latest insights into the utility of MRI in MS. PMID:28607577

  8. Semantic Interoperability of Health Risk Assessments

    PubMed Central

    Rajda, Jay; Vreeman, Daniel J.; Wei, Henry G.

    2011-01-01

    The health insurance and benefits industry has administered Health Risk Assessments (HRAs) at an increasing rate. These are used to collect data on modifiable health risk factors for wellness and disease management programs. However, there is significant variability in the semantics of these assessments, making it difficult to compare data sets from the output of 2 different HRAs. There is also an increasing need to exchange these data with Health Information Exchanges and Electronic Medical Records. To standardize the data and concepts from these tools, we outline a process to determine the presence of certain common elements of modifiable health risk extracted from these surveys. This information is coded using concept identifiers, which allows cross-survey comparison and analysis. We propose that using LOINC codes or another universal coding schema may allow semantic interoperability of a variety of HRA tools across industry, research, and clinical settings. PMID:22195174
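
    A minimal sketch of the coding step described above, assuming a hand-built question-to-code lookup. The question strings are invented, and the LOINC codes shown are commonly cited ones that should be verified against the current LOINC release before any real use:

```python
# Hypothetical mapping from HRA question text to LOINC observation codes.
LOINC_MAP = {
    "Do you currently smoke?": "72166-2",  # Tobacco smoking status
    "What is your height?": "8302-2",      # Body height
    "What is your weight?": "29463-7",     # Body weight
}

def encode_responses(responses):
    """Turn {question: answer} into LOINC-coded (code, value) pairs,
    keeping unmapped questions aside for manual review."""
    coded, unmapped = [], []
    for question, answer in responses.items():
        code = LOINC_MAP.get(question)
        if code:
            coded.append((code, answer))
        else:
            unmapped.append((question, answer))
    return coded, unmapped
```

    Once two HRAs' items are mapped onto the same codes, their outputs become directly comparable regardless of the original question wording.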

  9. Common Core State Standards: Implementation Tools and Resources

    ERIC Educational Resources Information Center

    Council of Chief State School Officers, 2013

    2013-01-01

    The Council of Chief State School Officers (CCSSO or the Council) developed this list of free tools and resources to support state education agencies, districts, and educators during the process of implementing the Common Core State Standards (CCSS). This document primarily lists resources developed by CCSSO and other leading organizations and is…

  10. Detection of copy number variations in epilepsy using exome data.

    PubMed

    Tsuchida, N; Nakashima, M; Kato, M; Heyman, E; Inui, T; Haginoya, K; Watanabe, S; Chiyonobu, T; Morimoto, M; Ohta, M; Kumakura, A; Kubota, M; Kumagai, Y; Hamano, S-I; Lourenco, C M; Yahaya, N A; Ch'ng, G-S; Ngu, L-H; Fattal-Valevski, A; Weisz Hubshman, M; Orenstein, N; Marom, D; Cohen, L; Goldberg-Stern, H; Uchiyama, Y; Imagawa, E; Mizuguchi, T; Takata, A; Miyake, N; Nakajima, H; Saitsu, H; Miyatake, S; Matsumoto, N

    2018-03-01

    Epilepsies are common neurological disorders, and genetic factors contribute to their pathogenesis. Copy number variations (CNVs) are increasingly recognized as an important etiology of many human diseases, including epilepsy. Whole-exome sequencing (WES) is becoming a standard tool for detecting pathogenic mutations and has recently been applied to detecting CNVs. Here, we analyzed 294 families with epilepsy using WES, and focused on 168 families with no causative single nucleotide variants in known epilepsy-associated genes, validating CNVs with 2 different CNV detection tools applied to the WES data. We confirmed 18 pathogenic CNVs, and 2 deletions and 2 duplications at chr15q11.2 of clinically unknown significance. Of note, we were able to identify small CNVs less than 10 kb in size, which might be difficult to detect by conventional microarray. We found 2 cases with pathogenic CNVs that one of the 2 CNV detection tools failed to find, suggesting that using different CNV tools is recommended to increase diagnostic yield. Considering the relatively high discovery rate of CNVs (18 out of 168 families, 10.7%) and the successful detection of CNVs <10 kb in size, CNV detection by WES may be able to substitute for, or at least complement, conventional microarray analysis. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
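
    Combining call sets from two CNV tools is typically done by interval overlap. The sketch below is a generic illustration, not the authors' pipeline: it takes the union of two call lists and tags each CNV by its supporting tool(s), using an assumed 50% reciprocal-overlap threshold to decide when two calls describe the same event:

```python
def overlap_frac(a, b):
    """Reciprocal overlap of two intervals given as (chrom, start, end):
    the smaller of the two mutual overlap fractions."""
    if a[0] != b[0]:
        return 0.0
    inter = min(a[2], b[2]) - max(a[1], b[1])
    if inter <= 0:
        return 0.0
    return min(inter / (a[2] - a[1]), inter / (b[2] - b[1]))

def merge_calls(tool1, tool2, min_ro=0.5):
    """Union of two CNV call sets, tagging each call by supporting tool(s)."""
    merged, matched = [], set()
    for c1 in tool1:
        support = "tool1"
        for i, c2 in enumerate(tool2):
            if overlap_frac(c1, c2) >= min_ro:
                support = "both"
                matched.add(i)
        merged.append((c1, support))
    merged.extend((c2, "tool2") for i, c2 in enumerate(tool2)
                  if i not in matched)
    return merged
```

    Calls found by only one tool are kept rather than discarded, which matches the abstract's observation that requiring agreement between tools would have missed 2 pathogenic CNVs.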

  11. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    PubMed Central

    Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.

    2017-01-01

    Background There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135

  12. Concordant parent-child reports of anxiety predict impairment in youth with functional abdominal pain

    PubMed Central

    Cunningham, Natoshia Raishevich; Cohen, Mitchell B.; Farrell, Michael K.; Mezoff, Adam G.; Lynch-Jordan, Anne; Kashikar-Zuck, Susmita

    2014-01-01

    Introduction Functional abdominal pain (FAP) is associated with significant anxiety and impairment. Prior investigations of child anxiety in youth with FAP are generally limited by small sample sizes, reliance on child report alone, and the use of lengthy diagnostic tools. It is unknown 1) if a brief anxiety screening tool is feasible, 2) whether parent and child reports of anxiety are congruent, and 3) whether parent and child agreement about child anxiety corresponds to increased impairment. The purpose of this investigation was to examine anxiety characteristics in youth with FAP using parent and child reports. Parent-child agreement about child anxiety symptoms was examined in relation to pain and disability. Materials and Methods One-hundred patients with FAP (8-18 years of age) recruited from pediatric gastroenterology clinics completed measures of pain intensity (Numeric Rating Scale) and disability (Functional Disability Inventory). Patients and caregivers both completed a measure of child anxiety characteristics (Screen for Child Anxiety and Related Disorders). Results Clinically significant anxiety symptoms were more commonly reported by youth (54%) than their parents (30%). Panic/somatic symptoms, generalized anxiety, and separation anxiety were most commonly endorsed by patients, whereas generalized anxiety, separation anxiety, and school avoidance were most commonly reported by parents. The majority (65%) of parents and children agreed on the presence (26%) or absence (39%) of clinically significant anxiety. Parent-child agreement about clinically significant anxiety was related to increased impairment. Discussion A brief screening instrument of parent and child reports of anxiety can provide clinically relevant information for comprehensive treatment planning in children with FAP. PMID:25714575

  13. Evaluating endothelial function of the common carotid artery: an in vivo human model.

    PubMed

    Mazzucco, S; Bifari, F; Trombetta, M; Guidi, G C; Mazzi, M; Anzola, G P; Rizzuto, N; Bonadonna, R

    2009-03-01

    Flow-mediated dilation (FMD) of peripheral conduit arteries is a well-established tool for evaluating endothelial function. The aim of this study is to apply the FMD model to the cerebral circulation by using acetazolamide (ACZ)-induced intracranial vasodilation as a stimulus to increase common carotid artery (CCA) diameter in response to a local increase in blood flow velocity (BFV). In 15 healthy subjects, CCA end-diastolic diameter and BFV, middle cerebral artery (MCA) BFV and mean arterial blood pressure (MBP) were measured at basal conditions, after an intravenous bolus of 1 g of ACZ, and after sublingual administration of placebo (saline) at the 15th and 20th minute. In a separate session, the same parameters were evaluated after placebo (saline) infusion instead of ACZ and after 10 microg/m(2) bs and 300 microg of glyceryl trinitrate (GTN), administered sublingually at the 15th and 20th minute, respectively. After the ACZ bolus, there was a 35% maximal increase in MCA mean BFV (14th minute), together with a 22% increase in mean CCA end-diastolic BFV and a CCA diameter increment of 3.9% at the 3rd minute (p=0.024). There were no significant MBP variations up to the 15th minute (p=0.35). After GTN administration, there was a significant increment in CCA diameter (p<0.00001). ACZ causes a detectable CCA dilation in healthy individuals concomitantly with an increase in BFV. Upon demonstration that this phenomenon is endothelium dependent, this experimental model might become a valuable tool to assess endothelial function in the carotid artery.

  14. Barefoot vs common footwear: A systematic review of the kinematic, kinetic and muscle activity differences during walking.

    PubMed

    Franklin, Simon; Grey, Michael J; Heneghan, Nicola; Bowen, Laura; Li, François-Xavier

    2015-09-01

    Habitual footwear use has been reported to influence foot structure with an acute exposure being shown to alter foot position and mechanics. The foot is highly specialised thus these changes in structure/position could influence functionality. This review aims to investigate the effect of footwear on gait, specifically focussing on studies that have assessed kinematics, kinetics and muscle activity between walking barefoot and in common footwear. In line with PRISMA and published guidelines, a literature search was completed across six databases comprising Medline, EMBASE, Scopus, AMED, Cochrane Library and Web of Science. Fifteen of 466 articles met the predetermined inclusion criteria and were included in the review. All articles were assessed for methodological quality using a modified assessment tool based on the STROBE statement for reporting observational studies and the CASP appraisal tool. Walking barefoot enables increased forefoot spreading under load and habitual barefoot walkers have anatomically wider feet. Spatial-temporal differences including, reduced step/stride length and increased cadence, are observed when barefoot. Flatter foot placement, increased knee flexion and a reduced peak vertical ground reaction force at initial contact are also reported. Habitual barefoot walkers exhibit lower peak plantar pressures and pressure impulses, whereas peak plantar pressures are increased in the habitually shod wearer walking barefoot. Footwear particularly affects the kinematics and kinetics of gait acutely and chronically. Little research has been completed in older age populations (50+ years) and thus further research is required to better understand the effect of footwear on walking across the lifespan. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Large High Resolution Displays for Co-Located Collaborative Sensemaking: Display Usage and Territoriality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Endert, Alexander; Koch, Kristen

    2013-08-01

    Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.

  16. The effect of genetic test-based risk information on behavioral outcomes: A critical examination of failed trials and a call to action.

    PubMed

    Austin, Jehannine

    2015-12-01

    Encouraging individuals at risk for common complex disease like heart disease, cancer, and diabetes to adopt lifestyle changes (e.g., smoking cessation, exercise, proper nutrition, increased screening) could be powerful public health tools to decrease the enormous personal and economic burden of these conditions. Theoretically, genetic risk information appears to be a compelling tool that could be used to provoke at-risk individuals to adopt these lifestyle changes. Unfortunately, however, numerous studies now have shown that providing individuals with genetic test-based risk information has little to no impact on their behavior. In this article (a commentary not a systematic review), the failed trials in which genetic information has been used as a tool to induce behavior change will be critically examined in order to identify new and potentially more effective ways forward. © 2015 Wiley Periodicals, Inc.

  17. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be more efficiently planned to accommodate growing populations while mitigating urban sprawl, and also because of innovations in rendering data in 3D being well suited for visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool, developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  18. Digital fabrication of textiles: an analysis of electrical networks in 3D knitted functional fabrics

    NASA Astrophysics Data System (ADS)

    Vallett, Richard; Knittel, Chelsea; Christe, Daniel; Castaneda, Nestor; Kara, Christina D.; Mazur, Krzysztof; Liu, Dani; Kontsos, Antonios; Kim, Youngmoo; Dion, Genevieve

    2017-05-01

    Digital fabrication methods are reshaping design and manufacturing processes through the adoption of pre-production visualization and analysis tools, which help minimize waste of materials and time. Despite the increasingly widespread use of digital fabrication techniques, comparatively few of these advances have benefited the design and fabrication of textiles. The development of functional fabrics such as knitted touch sensors, antennas, capacitors, and other electronic textiles could benefit from the same advances in electrical network modeling that revolutionized the design of integrated circuits. In this paper, the efficacy of using current state-of-the-art digital fabrication tools over the more common trial-and-error methods currently used in textile design is demonstrated. Gaps are then identified in the current state-of-the-art tools that must be resolved to further develop and streamline the rapidly growing field of smart textiles and devices, bringing textile production into the realm of 21st century manufacturing.

  19. Facilitation of Function and Manipulation Knowledge of Tools Using Transcranial Direct Current Stimulation (tDCS).

    PubMed

    Ishibashi, Ryo; Mima, Tatsuya; Fukuyama, Hidenao; Pobric, Gorana

    2017-01-01

    Using a variety of tools is a common and essential component of modern human life. Patients with brain damage or neurological disorders frequently have cognitive deficits in their recognition and manipulation of tools. In this study, we focused on improving tool-related cognition using transcranial direct current stimulation (tDCS). Converging evidence from neuropsychology, neuroimaging and non-invasive brain stimulation has identified the anterior temporal lobe (ATL) and inferior parietal lobule (IPL) as brain regions supporting action semantics. We observed enhanced performance in tool cognition with anodal tDCS over ATL and IPL in two cognitive tasks that require rapid access to semantic knowledge about the function or manipulation of common tools. ATL stimulation improved access to both function and manipulation knowledge of tools. The effect of IPL stimulation showed a trend toward better manipulation judgments. Our findings support previous studies of tool semantics and provide a novel approach for manipulation of underlying circuits.

  20. Fall Risk Assessment Tools for Elderly Living in the Community: Can We Do Better?

    PubMed

    Palumbo, Pierpaolo; Palmerini, Luca; Bandinelli, Stefania; Chiari, Lorenzo

    2015-01-01

    Falls are a common, serious threat to the health and self-confidence of the elderly. Assessment of fall risk is an important aspect of effective fall prevention programs. In order to test whether it is possible to outperform current prognostic tools for falls, we analyzed 1010 variables pertaining to mobility collected from 976 elderly subjects (InCHIANTI study). We trained and validated a data-driven model that issues probabilistic predictions about future falls. We benchmarked the model against other fall risk indicators: history of falls, gait speed, Short Physical Performance Battery (Guralnik et al. 1994), and the literature-based fall risk assessment tool FRAT-up (Cattelani et al. 2015). Parsimony in the number of variables included in a tool is often considered a proxy for ease of administration. We studied how constraints on the number of variables affect predictive accuracy. The proposed model and FRAT-up both attained the same discriminative ability; the area under the Receiver Operating Characteristic (ROC) curve (AUC) for multiple falls was 0.71. They outperformed the other risk scores, which reported AUCs for multiple falls between 0.64 and 0.65. Thus, it appears that both data-driven and literature-based approaches are better at estimating fall risk than commonly used fall risk indicators. The accuracy-parsimony analysis revealed that tools with a small number of predictors (~1-5) were suboptimal. Increasing the number of variables improved the predictive accuracy, reaching a plateau at ~20-30, which we can consider as the best trade-off between accuracy and parsimony. Obtaining the values of these ~20-30 variables does not compromise usability, since they are usually available in comprehensive geriatric assessments.
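
    The AUC figures quoted above can be computed without any curve fitting: the AUC equals the probability that a randomly chosen faller receives a higher risk score than a randomly chosen non-faller (the Mann-Whitney formulation). A minimal implementation:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive (faller) outscores a
    random negative (non-faller), counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    On this scale, the reported 0.71 for the data-driven model and FRAT-up versus 0.64-0.65 for single indicators means the former rank fallers above non-fallers noticeably more often.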

  1. Critical appraisal of nonrandomized studies-A review of recommended and commonly used tools.

    PubMed

    Quigley, Joan M; Thompson, Juliette C; Halfpenny, Nicholas J; Scott, David A

    2018-02-27

    When randomized controlled trial data are limited or unavailable, or to supplement randomized controlled trial evidence, health technology assessment (HTA) agencies may rely on systematic reviews of nonrandomized studies (NRSs) for evidence of the effectiveness of health care interventions. NRS designs may introduce considerable bias into systematic reviews, and several methodologies by which to evaluate this risk of bias are available. This study aimed to identify tools commonly used to assess bias in NRSs and determine those recommended by HTA bodies. Appraisal tools used in NRSs were identified through a targeted search of systematic reviews (January 2013-March 2017; MEDLINE and EMBASE [OVID SP]). Recommendations for the critical appraisal of NRSs by expert review groups and HTA bodies were reviewed. From the 686 studies included in the narrative synthesis, 48 critical appraisal tools were identified. Commonly used tools included the Newcastle-Ottawa Scale, the methodological index for NRS, and bespoke appraisal tools. Neither the Cochrane Handbook nor the Centre for Reviews and Dissemination recommends a particular instrument for the assessment of risk of bias in NRSs, although Cochrane has recently developed its own NRS critical appraisal tool. Among HTA bodies, only the Canadian Agency for Drugs and Technologies in Health recommends use of a specific critical appraisal tool, SIGN 50 (for cohort or case-control studies). Several criteria including reporting, external validity, confounding, and power were examined. There is no consensus between HTA groups on the preferred appraisal tool. Reviewers should select from a suite of tools on the basis of the design of studies included in their review. © 2018 John Wiley & Sons, Ltd.

  2. Evaluation of PHI Hunter in Natural Language Processing Research.

    PubMed

    Redd, Andrew; Pickard, Steve; Meystre, Stephane; Scehnet, Jeffrey; Bolton, Dan; Heavirland, Julia; Weaver, Allison Lynn; Hope, Carol; Garvin, Jennifer Hornung

    2015-01-01

    We introduce and evaluate a new, easily accessible tool using a common statistical analysis and business analytics software suite, SAS, which can be programmed to remove specific protected health information (PHI) from a text document. Removal of PHI is important because the quantity of text documents used for research with natural language processing (NLP) is increasing. When using existing data for research, an investigator must remove all PHI not needed for the research to comply with human subjects' right to privacy. This process is similar, but not identical, to de-identification of a given set of documents. PHI Hunter removes PHI from free-form text. It is a set of rules to identify and remove patterns in text. PHI Hunter was applied to 473 Department of Veterans Affairs (VA) text documents randomly drawn from a research corpus stored as unstructured text in VA files. PHI Hunter performed well with PHI in the form of identification numbers such as Social Security numbers, phone numbers, and medical record numbers. The most commonly missed PHI items were names and locations. Incorrect removal of information occurred with text that looked like identification numbers. PHI Hunter fills a niche role that is related to but not equal to the role of de-identification tools. It gives research staff a tool to reasonably increase patient privacy. It performs well for highly sensitive PHI categories that are rarely used in research, but still shows possible areas for improvement. More development for patterns of text and linked demographic tables from electronic health records (EHRs) would improve the program so that more precise identifiable information can be removed. PHI Hunter is an accessible tool that can flexibly remove PHI not needed for research. If it can be tailored to the specific data set via linked demographic tables, its performance will improve in each new document set.
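
    PHI Hunter itself is implemented in SAS; as a generic illustration of the rule-based pattern-matching approach the abstract describes, a few regex rules in Python might look like this (the patterns are simplified examples, not the tool's actual rules):

```python
import re

# Simplified, illustrative rules in the spirit of PHI Hunter's
# pattern-matching approach (the real tool is a SAS rule set).
PHI_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # Social Security numbers
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # phone numbers
    (re.compile(r"\bMRN\W{0,2}\d{6,10}\b", re.I), "[MRN]"),   # medical record numbers
]

def scrub(text):
    """Replace each matched PHI pattern with its category tag."""
    for pattern, tag in PHI_PATTERNS:
        text = pattern.sub(tag, text)
    return text
```

    Fixed-format identifiers like these are exactly the categories on which the evaluation found the tool performed well; free-text names and locations, which lack such regular structure, were the most commonly missed items.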

  3. Evaluation of PHI Hunter in Natural Language Processing Research

    PubMed Central

    Redd, Andrew; Pickard, Steve; Meystre, Stephane; Scehnet, Jeffrey; Bolton, Dan; Heavirland, Julia; Weaver, Allison Lynn; Hope, Carol; Garvin, Jennifer Hornung

    2015-01-01

    Objectives We introduce and evaluate a new, easily accessible tool using a common statistical analysis and business analytics software suite, SAS, which can be programmed to remove specific protected health information (PHI) from a text document. Removal of PHI is important because the quantity of text documents used for research with natural language processing (NLP) is increasing. When using existing data for research, an investigator must remove all PHI not needed for the research to comply with human subjects’ right to privacy. This process is similar, but not identical, to de-identification of a given set of documents. Materials and methods PHI Hunter removes PHI from free-form text. It is a set of rules to identify and remove patterns in text. PHI Hunter was applied to 473 Department of Veterans Affairs (VA) text documents randomly drawn from a research corpus stored as unstructured text in VA files. Results PHI Hunter performed well with PHI in the form of identification numbers such as Social Security numbers, phone numbers, and medical record numbers. The most commonly missed PHI items were names and locations. Incorrect removal of information occurred with text that looked like identification numbers. Discussion PHI Hunter fills a niche role that is related to but not equal to the role of de-identification tools. It gives research staff a tool to reasonably increase patient privacy. It performs well for highly sensitive PHI categories that are rarely used in research, but still shows possible areas for improvement. More development for patterns of text and linked demographic tables from electronic health records (EHRs) would improve the program so that more precise identifiable information can be removed. Conclusions PHI Hunter is an accessible tool that can flexibly remove PHI not needed for research. If it can be tailored to the specific data set via linked demographic tables, its performance will improve in each new document set. PMID:26807078

  4. Teaching to the Common Core by Design, Not Accident

    ERIC Educational Resources Information Center

    Phillips, Vicki; Wong, Carina

    2012-01-01

    The Bill & Melinda Gates Foundation has created tools and supports intended to help teachers adapt to the Common Core State Standards in English language arts and mathematics. The tools seek to find the right balance between encouraging teachers' creativity and giving them enough guidance to ensure quality. They are the product of two years of…

  5. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework that enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introducing new methods to clinics requires basic tools for the collection and communication of the data gathered. Our aim is to develop tools that, with minimal interference, offer new opportunities for enhancing current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization on pen-enabled tablets. We also discuss a number of methods for evaluating the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools, provided in the Python framework CogExTools available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform and a foundation for further collaborative development by the wider research community and enthusiasts; it is open-source and free to download and use. We introduce this set of open-source tools to facilitate the design and development of new cognitive tests for modern technology and to enable the adaptation of that technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
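    As a hedged illustration of the kind of measure digitization enables beyond visual inspection of a finished drawing, the sketch below derives pen-down duration and path length from timestamped tablet samples. The sample format and the function are assumptions for illustration, not part of the CogExTools framework.

```python
import math

# Hypothetical sample format: (t_seconds, x, y, pen_down) tuples from a tablet.
def stroke_features(samples):
    """Path length and pen-down time: measures a paper test cannot capture."""
    path, down_time = 0.0, 0.0
    for (t0, x0, y0, d0), (t1, x1, y1, d1) in zip(samples, samples[1:]):
        if d0 and d1:  # pen stayed on the surface between the two samples
            path += math.hypot(x1 - x0, y1 - y0)
            down_time += t1 - t0
    return {"path_length": path, "pen_down_s": down_time}

samples = [(0.0, 0, 0, True), (0.5, 3, 4, True), (1.0, 3, 4, False)]
print(stroke_features(samples))  # {'path_length': 5.0, 'pen_down_s': 0.5}
```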

  6. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

    Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistical based (Vega, CASE Ultra) or rule based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to make predictions based on presence/absence of structural features associated to sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The prediction for skin sensitization potential was compared with the experimental data. Rule-based tools perform slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistical tools that had accuracies ranging from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased the performance, with positive and negative predictive values up to 80% and 84%, respectively. However, the number of substances that were predicted positive or negative for skin sensitization in both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database www.asopus.weebly.com that provides a potential end user context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.
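    The accuracy, positive predictive value, and negative predictive value reported above are standard two-by-two-table quantities. A minimal sketch, using invented counts (the split into true/false positives and negatives is an assumption, chosen only to be consistent with an 82-positive/78-negative set like the one described):

```python
def binary_metrics(tp, fp, tn, fn):
    """Accuracy, positive and negative predictive value from a 2x2 table."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts: 82 experimentally positive, 78 negative substances.
m = binary_metrics(tp=66, fn=16, fp=16, tn=62)
print(m["accuracy"])  # 0.8
```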

  7. GenomeCAT: a versatile tool for the analysis and integrative visualization of DNA copy number variants.

    PubMed

    Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard

    2017-01-06

    The analysis of DNA copy number variants (CNV) has an increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high-resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in the discrimination of clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated, respectively, to the inspection of single cases, the comparative analysis of multidimensional data, and group comparisons aimed at identifying recurrent aberrations in patients sharing the same phenotype. Its flexible import options ease the comparative analysis of a user's own results derived from microarray or NGS platforms with data from the literature or public repositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable joint visualization and analysis. All results are stored in the integrated MySQL database but can also be exported as tab-delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface, and the installation process is supported by a wizard. The flexibility in terms of data import and export, in combination with the ability to create a common data matrix, makes the program also well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to the modular architecture, the functionality of GenomeCAT can easily be extended by further R packages or customized plug-ins to meet future requirements.
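    The "common data matrix" idea, merging results from different experiment types keyed by genomic region into one table, can be sketched as follows. The region names, values, and function are invented for illustration and are not GenomeCAT's implementation.

```python
# experiments: {experiment_name: {region: value}}; missing cells become None.
def merge_to_matrix(experiments):
    """Merge per-experiment results into one region-by-experiment table."""
    regions = sorted({r for exp in experiments.values() for r in exp})
    header = ["region"] + list(experiments)
    rows = [[r] + [experiments[e].get(r) for e in experiments] for r in regions]
    return header, rows

# Invented copy-number calls from two experiment types for the same patient:
acgh = {"1p36": 1, "8q24": 3}
ngs = {"8q24": 3, "17p13": 1}
header, rows = merge_to_matrix({"aCGH": acgh, "NGS": ngs})
print(header)  # ['region', 'aCGH', 'NGS']
```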

  8. Google and Women's Health-Related Issues: What Does the Search Engine Data Reveal?

    PubMed

    Baazeem, Mazin; Abenhaim, Haim

    2014-01-01

    Identifying the gaps in public knowledge of women's health-related issues has always been difficult. With the increasing number of Internet users in the United States, we sought to use the Internet as a tool to help us identify such gaps and to estimate women's most prevalent health concerns by examining commonly searched health-related keywords in the Google search engine. We collected a large pool of possible search keywords from two independent practicing obstetrician/gynecologists, classified them into five main categories (obstetrics, gynecology, infertility, urogynecology/menopause and oncology), and measured the monthly average search volume within the United States for each keyword with all its possible combinations using the Google AdWords tool. We found that pregnancy-related keywords were less frequently searched in general compared to other categories, with an average of 145,400 hits per month for the top twenty keywords. Among the most common pregnancy-related keywords was "pregnancy and sex", while pregnancy-related diseases were uncommonly searched. HPV alone was searched 305,400 times per month. Of the cancers affecting women, breast cancer was the most commonly searched, with an average of 247,190 searches per month, followed by cervical cancer and then ovarian cancer. The commonly searched keywords are often issues that are not discussed in our daily practice or in public health messages. The search volume broadly tracks disease prevalence, with the exception of ovarian cancer, which may signify public fear.

  9. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  10. Domains of psychosocial disability and mental disorders.

    PubMed

    Ro, Eunyoe; Watson, David; Clark, Lee Anna

    2018-06-07

    This study examined relations between comprehensive domains of psychosocial disability and mental disorders to determine (1) whether differential patterns of associations exist between psychosocial disability dimensions and commonly diagnosed mental disorders and (2) whether these relations differ between self-reported and interviewer-rated psychosocial disability domains. Self-reported and interviewer-rated psychosocial functioning measures and an interviewer-rated diagnostic assessment tool were administered to 181 psychiatric outpatients. Internalizing disorders showed the strongest and most pervasive associations with psychosocial impairment across both self-reported and interviewer-rated measures, followed by thought disorder; externalizing disorders showed the weakest associations. More specifically, logistic regression analyses indicated that a lower well-being factor score significantly increased the odds of distress-disorder diagnoses, and poor basic functioning increased the odds of PTSD. Results clearly showed differences in the magnitude of associations between the three dimensions of psychosocial disability and commonly diagnosed disorders, and these differences were similar regardless of rater type. © 2018 Wiley Periodicals, Inc.
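    In logistic regression, the finding that a lower score "increased the odds" corresponds to an odds ratio of exp(beta) per unit change in the predictor. A minimal sketch of that mapping; the coefficient value is made up for illustration and is not taken from the study:

```python
import math

def odds_ratio(beta, delta=1.0):
    """Multiplicative change in odds for a `delta`-unit change in a predictor."""
    return math.exp(beta * delta)

# A negative coefficient on well-being means lower scores raise the odds:
beta_wellbeing = -0.7  # hypothetical coefficient
print(odds_ratio(beta_wellbeing, delta=-1.0))  # ~2.01: odds roughly double per one-unit drop
```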

  11. Translating questionnaire items for a multi-lingual worker population: the iterative process of translation and cognitive interviews with English-, Spanish-, and Chinese-speaking workers.

    PubMed

    Fujishiro, Kaori; Gong, Fang; Baron, Sherry; Jacobson, C Jeffery; DeLaney, Sheli; Flynn, Michael; Eggerth, Donald E

    2010-02-01

    The increasing ethnic diversity of the US workforce has created a need for research tools that can be used with multi-lingual worker populations. Developing multi-language questionnaire items is a complex process; however, very little has been documented in the literature. Commonly used English items from the Job Content Questionnaire and Quality of Work Life Questionnaire were translated by two interdisciplinary bilingual teams and cognitively tested in interviews with English-, Spanish-, and Chinese-speaking workers. Common problems across languages mainly concerned response format. Language-specific problems required more conceptual than literal translations. Some items were better understood by non-English speakers than by English speakers. De-centering (i.e., modifying the English original to correspond with translation) produced better understanding for one item. Translating questionnaire items and achieving equivalence across languages require various kinds of expertise. Backward translation itself is not sufficient. More research efforts should be concentrated on qualitative approaches to developing useful research tools. Published 2009 Wiley-Liss, Inc.

  12. The minimalist grammar of action

    PubMed Central

    Pastra, Katerina; Aloimonos, Yiannis

    2012-01-01

    Language and action have been found to share a common neural basis and in particular a common ‘syntax’, an analogous hierarchical and compositional organization. While language structure analysis has led to the formulation of different grammatical formalisms and associated discriminative or generative computational models, the structure of action is still elusive and so are the related computational models. However, structuring action has important implications for action learning and generalization, in both human cognition research and computation. In this study, we present a biologically inspired generative grammar of action, which employs the structure-building operations and principles of Chomsky's Minimalist Programme as a reference model. In this grammar, action terminals combine hierarchically into temporal sequences of actions of increasing complexity; the actions are bound with the involved tools and affected objects and are governed by certain goals. We show how the tool role and the affected-object role of an entity within an action drives the derivation of the action syntax in this grammar and controls recursion, merge and move, the latter being mechanisms that manifest themselves not only in human language, but in human action too. PMID:22106430

  13. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians, and redundancies in clinical assessments affect both patients and nurses negatively. Our aim was to validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes, such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r from 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as well as the original tools. © 2016 John Wiley & Sons, Ltd.
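    Interpreting the abstract's "common variance" as the squared inter-tool correlation (the standard reading), a quick check reproduces the reported range up to rounding of the correlation coefficients:

```python
# Shared (common) variance between two measures is r^2, here as a percentage.
def common_variance_pct(r):
    return round(100 * r * r, 1)

print(common_variance_pct(0.428))  # 18.3  (abstract: 18.3%)
print(common_variance_pct(0.867))  # 75.2  (abstract: 75.1%, within rounding)
```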

  14. Inventory on the dietary assessment tools available and needed in africa: a prerequisite for setting up a common methodological research infrastructure for nutritional surveillance, research, and prevention of diet-related non-communicable diseases.

    PubMed

    Pisa, Pedro T; Landais, Edwige; Margetts, Barrie; Vorster, Hester H; Friedenreich, Christine M; Huybrechts, Inge; Martin-Prevel, Yves; Branca, Francesco; Lee, Warren T K; Leclercq, Catherine; Jerling, Johann; Zotor, Francis; Amuna, Paul; Al Jawaldeh, Ayoub; Aderibigbe, Olaide Ruth; Amoussa, Waliou Hounkpatin; Anderson, Cheryl A M; Aounallah-Skhiri, Hajer; Atek, Madjid; Benhura, Chakare; Chifamba, Jephat; Covic, Namukolo; Dary, Omar; Delisle, Hélène; El Ati, Jalila; El Hamdouchi, Asmaa; El Rhazi, Karima; Faber, Mieke; Kalimbira, Alexander; Korkalo, Liisa; Kruger, Annamarie; Ledo, James; Machiweni, Tatenda; Mahachi, Carol; Mathe, Nonsikelelo; Mokori, Alex; Mouquet-Rivier, Claire; Mutie, Catherine; Nashandi, Hilde Liisa; Norris, Shane A; Onabanjo, Oluseye Olusegun; Rambeloson, Zo; Saha, Foudjo Brice U; Ubaoji, Kingsley Ikechukwu; Zaghloul, Sahar; Slimani, Nadia

    2018-01-02

    To carry out an inventory on the availability, challenges, and needs of dietary assessment (DA) methods in Africa as a pre-requisite to provide evidence, and set directions (strategies) for implementing common dietary methods and support web-research infrastructure across countries. The inventory was performed within the framework of the "Africa's Study on Physical Activity and Dietary Assessment Methods" (AS-PADAM) project. It involves international institutional and African networks. An inventory questionnaire was developed and disseminated through the networks. Eighteen countries responded to the dietary inventory questionnaire. Various DA tools were reported in Africa; 24-Hour Dietary Recall and Food Frequency Questionnaire were the most commonly used tools. Few tools were validated and tested for reliability. Face-to-face interview was the common method of administration. No computerized software or other new (web) technologies were reported. No tools were standardized across countries. The lack of comparable DA methods across represented countries is a major obstacle to implement comprehensive and joint nutrition-related programmes for surveillance, programme evaluation, research, and prevention. There is a need to develop new or adapt existing DA methods across countries by employing related research infrastructure that has been validated and standardized in other settings, with the view to standardizing methods for wider use.

  15. A New Continuous Cooling Transformation Diagram for AISI M4 High-Speed Tool Steel

    NASA Astrophysics Data System (ADS)

    Briki, Jalel; Ben Slima, Souad

    2008-12-01

    The increasing sophistication of dilatometric techniques now allows for the identification of structural transformations with very weak signals. The use of dilatometric techniques, coupled with more common techniques such as metallography, hardness testing and x-ray diffraction, allows the plotting of a new CCT diagram for AISI M4 high-speed tool steel. This diagram is useful for a better selection of alternative solutions for hardening and tempering heat treatments. A more accurate determination of the various fields of transformation of austenite during cooling was made. The precipitation of carbides highlighted at high temperature is at the origin of the martensitic transformation occurring in two stages (splitting phenomenon). For slow cooling rates, it was possible to highlight the ferritic, pearlitic and bainitic transformations.

  16. Surficial geological tools in fluvial geomorphology: Chapter 2

    USGS Publications Warehouse

    Jacobson, Robert B.; O'Connor, James E.; Oguchi, Takashi

    2016-01-01

    Increasingly, environmental scientists are being asked to develop an understanding of how rivers and streams have been altered by environmental stresses, whether rivers are subject to physical or chemical hazards, how they can be restored, and how they will respond to future environmental change. These questions present substantive challenges to the discipline of fluvial geomorphology, especially since decades of geomorphologic research have demonstrated the general complexity of fluvial systems. It follows from the concept of complex response that synoptic and short-term historical views of rivers will often give misleading understanding of future behavior. Nevertheless, broadly trained geomorphologists can address questions involving complex natural systems by drawing from a tool box that commonly includes the principles and methods of geology, hydrology, hydraulics, engineering, and ecology.

  17. Improved injection needles facilitate germline transformation of the buckeye butterfly Junonia coenia.

    PubMed

    Beaudette, Kahlia; Hughes, Tia M; Marcus, Jeffrey M

    2014-01-01

    Germline transformation with transposon vectors is an important tool for insect genetics, but progress in developing transformation protocols for butterflies has been limited by high post-injection ova mortality. Here we present an improved glass injection needle design for injecting butterfly ova that increases survival in three Nymphalid butterfly species. Using the needles to genetically transform the common buckeye butterfly Junonia coenia, the hatch rate for injected Junonia ova was 21.7%, the transformation rate was 3%, and the overall experimental efficiency was 0.327%, a substantial improvement over previous results in other butterfly species. Improved needle design and a higher efficiency of transformation should permit the deployment of transposon-based genetic tools in a broad range of less fecund lepidopteran species.

  18. G-CNV: A GPU-Based Tool for Preparing Data to Detect CNVs with Read-Depth Methods.

    PubMed

    Manconi, Andrea; Manca, Emanuele; Moscatelli, Marco; Gnocchi, Matteo; Orro, Alessandro; Armano, Giuliano; Milanesi, Luciano

    2015-01-01

    Copy number variations (CNVs) are the most prevalent types of structural variations (SVs) in the human genome and are involved in a wide range of common human diseases. Different computational methods have been devised to detect this type of SVs and to study how they are implicated in human diseases. Recently, computational methods based on high-throughput sequencing (HTS) are increasingly used. The majority of these methods focus on mapping short-read sequences generated from a donor against a reference genome to detect signatures distinctive of CNVs. In particular, read-depth based methods detect CNVs by analyzing genomic regions with significantly different read-depth from the other ones. The pipeline analysis of these methods consists of four main stages: (i) data preparation, (ii) data normalization, (iii) CNV regions identification, and (iv) copy number estimation. However, available tools do not support most of the operations required at the first two stages of this pipeline. Typically, they start the analysis by building the read-depth signal from pre-processed alignments. Therefore, third-party tools must be used to perform most of the preliminary operations required to build the read-depth signal. These data-intensive operations can be efficiently parallelized on graphics processing units (GPUs). In this article, we present G-CNV, a GPU-based tool devised to perform the common operations required at the first two stages of the analysis pipeline. G-CNV is able to filter low-quality read sequences, to mask low-quality nucleotides, to remove adapter sequences, to remove duplicated read sequences, to map the short-reads, to resolve multiple mapping ambiguities, to build the read-depth signal, and to normalize it. G-CNV can be efficiently used as a third-party tool able to prepare data for the subsequent read-depth signal generation and analysis. Moreover, it can also be integrated in CNV detection tools to generate read-depth signals.
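    The read-depth signal such a pipeline builds can be sketched minimally: count mapped-read start positions per fixed-size window, then normalize by the mean depth so gains and losses stand out. The positions and window size below are invented, and this pure-Python sketch is not G-CNV's GPU implementation.

```python
from collections import Counter

def read_depth_signal(read_starts, window=100):
    """Per-window read counts, normalized by the mean (~1.0 = average depth)."""
    counts = Counter(pos // window for pos in read_starts)
    n_windows = max(counts) + 1
    depth = [counts.get(i, 0) for i in range(n_windows)]
    mean = sum(depth) / len(depth)
    return [d / mean for d in depth]

# Invented mapped-read start positions; window 2 looks like a gain, 3 a loss.
reads = [5, 20, 90, 110, 150, 160, 170, 305]
print(read_depth_signal(reads))  # [1.5, 2.0, 0.0, 0.5]
```

    Read-depth CNV callers then look for runs of windows whose normalized depth departs significantly from 1.0.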

  19. a Free and Open Source Tool to Assess the Accuracy of Land Cover Maps: Implementation and Application to Lombardy Region (italy)

    NASA Astrophysics Data System (ADS)

    Bratic, G.; Brovelli, M. A.; Molinari, M. E.

    2018-04-01

    The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The evaluation of the accuracy of classified data is carried out through a comparison with a reference dataset and the generation of a confusion matrix, from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion-matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool allowed the computation and investigation of less well-known indexes such as the Ground Truth Index and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user communities.
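    Overall accuracy and Cohen's kappa, the two most common confusion-matrix indexes named above, are straightforward to compute. The sketch below uses a hypothetical two-class matrix (rows = reference, columns = classified) and is not the authors' tool:

```python
def overall_accuracy(cm):
    """Fraction of samples on the confusion-matrix diagonal."""
    total = sum(map(sum, cm))
    return sum(cm[i][i] for i in range(len(cm))) / total

def kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    total = sum(map(sum, cm))
    po = overall_accuracy(cm)
    # Chance agreement from row and column marginals:
    pe = sum(sum(row) * sum(col) for row, col in zip(cm, zip(*cm))) / total**2
    return (po - pe) / (1 - pe)

cm = [[40, 10],   # hypothetical counts
      [5, 45]]
print(overall_accuracy(cm))  # 0.85
print(round(kappa(cm), 2))   # 0.7
```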

  20. Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, Brian K; Nuttall, David; Cukier, Michael

    The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore multiple fixtures, 10 to 27, are required per refrigerator model. Molds are particular to a specific product configuration, making mixed-model assembly challenging for sequencing, mold changes or auto-changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.

  1. Teachers Connect "with" Technology: Online Tools Build New Pathways to Collaboration

    ERIC Educational Resources Information Center

    Phillips, Vicki L.; Olson, Lynn

    2013-01-01

    Teachers, curriculum experts, and other educators work together using online tools developed by the Bill & Melinda Gates Foundation to create high-quality, useful lessons and research-based instructional tools incorporating the Common Core State Standards.

  2. Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems

    PubMed Central

    Sittig, Dean F; Ash, Joan S; Feblowitz, Joshua; Meltzer, Seth; McMullen, Carmit; Guappone, Ken; Carpenter, Jim; Richardson, Joshua; Simonaitis, Linas; Evans, R Scott; Nichol, W Paul; Middleton, Blackford

    2011-01-01

    Background Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems. Objective To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs. Study design and methods We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4). Results Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common. Conclusion We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content. PMID:21415065

  3. A Case for Data Commons

    PubMed Central

    Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt

    2017-01-01

    Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693

  4. Sharing knowledge about immunisation (SKAI): An exploration of parents' communication needs to inform development of a clinical communication support intervention.

    PubMed

    Berry, Nina J; Danchin, Margie; Trevena, Lyndal; Witteman, Holly O; Kinnersley, Paul; Snelling, Tom; Robinson, Penelope; Leask, Julie

    2018-01-29

    The SKAI (Sharing Knowledge About Immunisation) project aims to develop effective communication tools to support primary health care providers' consultations with parents who may be hesitant about vaccinating their children. This study explored parents' communication needs using a qualitative design. Parents of at least one child less than five years old were recruited from two major cities and a regional town known for high prevalence of vaccine objection. Focus groups of parents who held similar vaccination attitudes and intentions were convened to discuss experiences of vaccination consultations and explore their communication needs, including preferences. Draft written communication support tools were used to stimulate discussion and gauge acceptability of the tools. Important differences in communication needs between group types emerged. The least hesitant parent groups reported feeling reassured upon reading resources designed to address commonly observed concerns about vaccination. As hesitancy of the group members increased, so did their accounts of the volume and detail of information they required. Trust appeared to be related to apparent or perceived transparency. More hesitant groups displayed increased sensitivity and resistance to persuasive language forms. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Lessons from the Johns Hopkins Multi-Disciplinary Venous Thromboembolism (VTE) Prevention Collaborative

    PubMed Central

    Streiff, Michael B; Carolan, Howard T; Hobson, Deborah B; Kraus, Peggy S; Holzmueller, Christine G; Demski, Renee; Lau, Brandyn D; Biscup-Horn, Paula; Pronovost, Peter J

    2012-01-01

    Problem Venous thromboembolism (VTE) is a common cause of potentially preventable mortality, morbidity, and increased medical costs. Risk-appropriate prophylaxis can prevent most VTE events, but only a small fraction of patients at risk receive this treatment. Design Prospective quality improvement programme. Setting Johns Hopkins Hospital, Baltimore, Maryland, USA. Strategies for change A multidisciplinary team established a VTE Prevention Collaborative in 2005. The collaborative applied the four step TRIP (translating research into practice) model to develop and implement a mandatory clinical decision support tool for VTE risk stratification and risk-appropriate VTE prophylaxis for all hospitalised adult patients. Initially, paper based VTE order sets were implemented, which were then converted into 16 specialty-specific, mandatory, computerised, clinical decision support modules. Key measures for improvement VTE risk stratification within 24 hours of hospital admission and provision of risk-appropriate, evidence based VTE prophylaxis. Effects of change The VTE team was able to increase VTE risk assessment and ordering of risk-appropriate prophylaxis with paper based order sets to a limited extent, but achieved higher compliance with a computerised clinical decision support tool and the data feedback which it enabled. Risk-appropriate VTE prophylaxis increased from 26% to 80% for surgical patients and from 25% to 92% for medical patients in 2011. Lessons learnt A computerised clinical decision support tool can increase VTE risk stratification and risk-appropriate VTE prophylaxis among hospitalised adult patients admitted to a large urban academic medical centre. It is important to ensure the tool is part of the clinician’s normal workflow, is mandatory (computerised forcing function), and offers the requisite modules needed for every clinical specialty. PMID:22718994

  6. The image-interpretation-workstation of the future: lessons learned

    NASA Astrophysics Data System (ADS)

    Maier, S.; van de Camp, F.; Hafermann, J.; Wagner, B.; Peinsipp-Byma, E.; Beyerer, J.

    2017-05-01

In recent years, professionally used workstations have become increasingly complex, and multi-monitor systems are more and more common. Novel interaction techniques like gesture recognition were developed but used mostly for entertainment and gaming purposes. These human-computer interfaces are not yet widely used in professional environments, where they could greatly improve the user experience. To approach this problem, we combined existing tools in our image-interpretation-workstation of the future, a multi-monitor workplace comprised of four screens. Each screen is dedicated to a special task in the image interpreting process: a geo-information system to geo-reference the images and provide a spatial reference for the user, an interactive recognition support tool, an annotation tool and a reporting tool. To further support the complex task of image interpreting, self-developed interaction systems for head-pose estimation and hand tracking were used in addition to more common technologies like touchscreens, face identification and speech recognition. A set of experiments was conducted to evaluate the usability of the different interaction systems. Two typical extensive tasks of image interpreting were devised and approved by military personnel. They were then tested with a current setup of an image interpreting workstation using only keyboard and mouse against our image-interpretation-workstation of the future. To get a more detailed look at the usefulness of the interaction techniques in a multi-monitor setup, the hand tracking, head-pose estimation and the face recognition were further evaluated using tests inspired by everyday tasks. The results of the evaluation and the discussion are presented in this paper.

  7. Bioinformatics Approaches to Classifying Allergens and Predicting Cross-Reactivity

    PubMed Central

    Schein, Catherine H.; Ivanciuc, Ovidiu; Braun, Werner

    2007-01-01

The major advances in understanding why patients respond to several seemingly different stimuli have been through the isolation, sequencing and structural analysis of proteins that induce an IgE response. The most significant finding is that allergenic proteins from very different sources can have nearly identical sequences and structures, and that this similarity can account for clinically observed cross-reactivity. The increasing amount of information on the sequence, structure and IgE epitopes of allergens is now available in several databases, and powerful bioinformatics search tools allow user access to relevant information. Here, we provide an overview of these databases and describe state-of-the-art bioinformatics tools to identify the common proteins that may be at the root of multiple allergy syndromes. Progress has also been made in quantitatively defining characteristics that discriminate allergens from non-allergens. Search and software tools for this purpose have been developed and implemented in the Structural Database of Allergenic Proteins (SDAP, http://fermi.utmb.edu/SDAP/). SDAP contains information for over 800 allergens and extensive bibliographic references in a relational database with links to other publicly available databases. SDAP is freely available on the Web to clinicians and patients, and can be used to find structural and functional relations among known allergens and to identify potentially cross-reacting antigens. Here we illustrate how these bioinformatics tools can be used to group allergens, and to detect areas that may account for common patterns of IgE binding and cross-reactivity. Such results can be used to guide treatment regimens for allergy sufferers. PMID:17276876
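
As an illustration of the kind of screening such tools perform, the sketch below implements a simplified sliding-window identity scan loosely modeled on the widely cited FAO/WHO criterion (roughly, ≥35% identity over an 80-residue window flags potential cross-reactivity). The threshold and sequences are illustrative only; SDAP's own methods (e.g. property-distance searches) are more sophisticated:

```python
def max_window_identity(query, target, window=80):
    """Best percent identity over all ungapped alignments of window-length
    segments of `query` against equally long segments of `target`."""
    best = 0.0
    for i in range(max(1, len(query) - window + 1)):
        q = query[i:i + window]
        for j in range(max(1, len(target) - len(q) + 1)):
            t = target[j:j + len(q)]
            matches = sum(a == b for a, b in zip(q, t))
            best = max(best, 100.0 * matches / len(q))
    return best

def flags_cross_reactivity(query, known_allergen, threshold=35.0):
    """Screen a query protein against one known allergen sequence."""
    return max_window_identity(query, known_allergen) >= threshold
```

In practice a query would be screened against every sequence in an allergen database, not a single target.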

  8. The Effect of a Mechanical Arm System on Portable Grinder Vibration Emissions.

    PubMed

    McDowell, Thomas W; Welcome, Daniel E; Warren, Christopher; Xu, Xueyan S; Dong, Ren G

    2016-04-01

    Mechanical arm systems are commonly used to support powered hand tools to alleviate ergonomic stressors related to the development of workplace musculoskeletal disorders. However, the use of these systems can increase exposure times to other potentially harmful agents such as hand-transmitted vibration. To examine how these tool support systems affect tool vibration, the primary objectives of this study were to characterize the vibration emissions of typical portable pneumatic grinders used for surface grinding with and without a mechanical arm support system at a workplace and to estimate the potential risk of the increased vibration exposure time afforded by the use of these mechanical arm systems. This study also developed a laboratory-based simulated grinding task based on the ISO 28927-1 (2009) standard for assessing grinder vibrations; the simulated grinding vibrations were compared with those measured during actual workplace grinder operations. The results of this study demonstrate that use of the mechanical arm may provide a health benefit by reducing the forces required to lift and maneuver the tools and by decreasing hand-transmitted vibration exposure. However, the arm does not substantially change the basic characteristics of grinder vibration spectra. The mechanical arm reduced the average frequency-weighted acceleration by about 24% in the workplace and by about 7% in the laboratory. Because use of the mechanical arm system can increase daily time-on-task by 50% or more, the use of such systems may actually increase daily time-weighted hand-transmitted vibration exposures in some cases. The laboratory acceleration measurements were substantially lower than the workplace measurements, and the laboratory tool rankings based on acceleration were considerably different than those from the workplace. 
Thus, it is doubtful that ISO 28927-1 is useful for estimating workplace grinder vibration exposures or for predicting workplace grinder acceleration rank orders. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2015.
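
The time-versus-intensity trade-off the authors describe can be made concrete with the standard ISO 5349-1 daily exposure measure, A(8) = a_hv·√(T/8 h). The baseline acceleration and daily durations below are invented example values; only the roughly 7% laboratory acceleration reduction and 50% time increase come from the abstract:

```python
import math

def a8(ahv, hours):
    """Daily vibration exposure A(8) per ISO 5349-1:
    A(8) = a_hv * sqrt(T / 8 h), where a_hv is the frequency-weighted
    vibration total value (m/s^2) and T is the daily trigger time."""
    return ahv * math.sqrt(hours / 8.0)

baseline = a8(5.0, 4.0)          # hand-held grinding, 4 h/day (example values)
with_arm = a8(5.0 * 0.93, 6.0)   # ~7% lower acceleration, 50% more time
# Despite the lower acceleration, the longer trigger time raises A(8),
# which is the effect the abstract warns about.
```

With these numbers the exposure ratio is 0.93·√1.5 ≈ 1.14, i.e. about a 14% higher daily exposure with the arm.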

  9. The Effect of a Mechanical Arm System on Portable Grinder Vibration Emissions

    PubMed Central

    McDowell, Thomas W.; Welcome, Daniel E.; Warren, Christopher; Xu, Xueyan S.; Dong, Ren G.

    2016-01-01

    Mechanical arm systems are commonly used to support powered hand tools to alleviate ergonomic stressors related to the development of workplace musculoskeletal disorders. However, the use of these systems can increase exposure times to other potentially harmful agents such as hand-transmitted vibration. To examine how these tool support systems affect tool vibration, the primary objectives of this study were to characterize the vibration emissions of typical portable pneumatic grinders used for surface grinding with and without a mechanical arm support system at a workplace and to estimate the potential risk of the increased vibration exposure time afforded by the use of these mechanical arm systems. This study also developed a laboratory-based simulated grinding task based on the ISO 28927-1 (2009) standard for assessing grinder vibrations; the simulated grinding vibrations were compared with those measured during actual workplace grinder operations. The results of this study demonstrate that use of the mechanical arm may provide a health benefit by reducing the forces required to lift and maneuver the tools and by decreasing hand-transmitted vibration exposure. However, the arm does not substantially change the basic characteristics of grinder vibration spectra. The mechanical arm reduced the average frequency-weighted acceleration by about 24% in the workplace and by about 7% in the laboratory. Because use of the mechanical arm system can increase daily time-on-task by 50% or more, the use of such systems may actually increase daily time-weighted hand-transmitted vibration exposures in some cases. The laboratory acceleration measurements were substantially lower than the workplace measurements, and the laboratory tool rankings based on acceleration were considerably different than those from the workplace. 
Thus, it is doubtful that ISO 28927-1 is useful for estimating workplace grinder vibration exposures or for predicting workplace grinder acceleration rank orders. PMID:26628522

  10. SNPConvert: SNP Array Standardization and Integration in Livestock Species.

    PubMed

    Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra

    2016-06-09

One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole genome sequence data analysis, SNP array data do not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
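
A minimal sketch of the allele-coding mismatch described above, assuming two hypothetical panels: one reporting forward-strand nucleotide calls and one reporting Illumina-style A/B calls plus an A/B-to-nucleotide map. This illustrates the problem, not SNPConvert's implementation:

```python
# Toy harmonization of genotype calls coded differently by two SNP panels.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def ab_to_nucleotide(call, allele_a, allele_b):
    """Translate an 'AA'/'AB'/'BB' call into nucleotides; '--' is a no-call."""
    table = {"A": allele_a, "B": allele_b}
    return "--" if call == "--" else "".join(table[c] for c in call)

def flip_strand(genotype):
    """Map a genotype to the opposite strand, e.g. 'GT' -> 'CA'."""
    return "".join(COMPLEMENT[a] for a in genotype)

def dosage(call, ref_allele):
    """Count copies of the reference allele (0, 1 or 2); None for no-call."""
    return None if call == "--" else sum(a == ref_allele for a in call)
```

Once both panels are expressed as nucleotide calls on the same strand, dosages become comparable: `dosage(ab_to_nucleotide("AB", "G", "T"), "G")` gives 1.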

  11. Improving Students' Understanding of Quantum Measurement. II. Development of Research-Based Learning Tools

    ERIC Educational Resources Information Center

    Zhu, Guangtian; Singh, Chandralekha

    2012-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students'…

  12. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  13. Forming Tool Use Representations: A Neurophysiological Investigation into Tool Exposure

    ERIC Educational Resources Information Center

    Mizelle, John Christopher; Tang, Teresa; Pirouz, Nikta; Wheaton, Lewis A.

    2011-01-01

    Prior work has identified a common left parietofrontal network for storage of tool-related information for various tasks. How these representations become established within this network on the basis of different modes of exposure is unclear. Here, healthy subjects engaged in physical practice (direct exposure) with familiar and unfamiliar tools.…

  14. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    NASA Astrophysics Data System (ADS)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

The Digimarc® Barcode is a digital watermark applied to packages and variable-data labels that carries the GS1-standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, improving store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  15. Aligning "TextEvaluator"® Scores with the Accelerated Text Complexity Guidelines Specified in the Common Core State Standards. Research Report. ETS RR-15-21

    ERIC Educational Resources Information Center

    Sheehan, Kathleen M.

    2015-01-01

    The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers, curriculum specialists, textbook publishers, and test developers select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards.This paper documents the procedure used…

  16. 49 CFR 1242.28 - Roadway machines, small tools and supplies, and snow removal (accounts XX-19-36 to XX-19-38...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... snow removal (accounts XX-19-36 to XX-19-38, inclusive). 1242.28 Section 1242.28 Transportation Other... tools and supplies, and snow removal (accounts XX-19-36 to XX-19-38, inclusive). Separate common expenses according to distribution of common expenses listed in § 1242.10, Administration—Track (account XX...

  17. Kidwatching: A Vygotskyan Approach to Children's Language In the "Star Wars" Age.

    ERIC Educational Resources Information Center

    Monroe, Suzanne S.

    A Vygotskyan review of children's language examines language samples of a 7-year-old boy at home, at a birthday party, and at play in a sandbox. The language samples indicate common patterns, including his use of tools and symbol together in play. A common thread in the samples is his involvement with high tech tools of futuristic toys. Vygotsky…

  18. The Relationship between the Physical Therapist Clinical Performance Instrument Scores and Doctor of Physical Therapy Student Learning Styles

    ERIC Educational Resources Information Center

    Courtright, Joachim

    2017-01-01

    INTRODUCTION. The learning style of a student is an important factor in their ability to gain knowledge. This is especially important in challenging curriculums such as the Doctor of Physical Therapy (DPT) program. A common tool to assess one's learning style is The Kolb Learning Styles Inventory (LSI). A common tool used to measure the…

  19. Common Evaluation Tools across Multi-State Programs: A Study of Parenting Education and Youth Engagement Programs in Children, Youth, and Families At-Risk

    ERIC Educational Resources Information Center

    Payne, Pamela B.; McDonald, Daniel A.

    2015-01-01

    Community-based education programs must demonstrate effectiveness to various funding sources. The pilot study reported here (funded by CYFAR, NIFA, USDA award #2008-41520-04810) had the goal of determining if state level programs with varied curriculum could use a common evaluation tool to demonstrate efficacy. Results in parenting and youth…

  20. Virtual Mobility in Reality: A Study of the Use of ICT in Finnish Leonardo da Vinci Mobility Projects.

    ERIC Educational Resources Information Center

    Valjus, Sonja

    An e-mail survey and interviews collected data on use of information and communications technology (ICT) in Finnish Leonardo da Vinci mobility projects from 2000-02. Findings showed that the most common ICT tools used were e-mail, digital tools, and the World Wide Web; ICT was used during all project phases; the most common problems concerned…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brouillette, Greg A.

These are slides for various presentations on C4ISR and urban disaster response and recovery tools, mainly charts and images of disaster response and recovery tools. Slides included have headings such as the following: vignette of a disaster response, situational awareness and common operating picture available to EOC, plume modeling capability, Program ASPECT Chemical Response Products, EPA ASPECT - Hurricane RITA Response 9/25/2005, Angel Fire Imagery, incident commander's view/police chief's view/EMS' view, common situational awareness and collaborative planning, exercise, training capability, systems diagram, Austere Challenge 06 Sim/C4 Requirements, common situational awareness and collaborative planning, exercise, training environment, common situational awareness, real world, crisis response, and consequence management.

  2. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
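
To show in miniature what trajectory-based Monte Carlo transport looks like (this is a toy illustration, not DTSA-II's physics models), the sketch below follows electrons through exponentially distributed free paths with isotropic scattering and a fixed energy loss per step, and tallies the fraction that re-cross the surface:

```python
import math, random

def toy_trajectory(energy_kev=20.0, mfp=0.05, de_per_step=0.5, rng=random):
    """Follow one electron: straight free paths of exponentially distributed
    length, isotropic (toy) scattering, fixed energy loss per step.
    Returns (backscattered, max_depth); units are arbitrary."""
    z, theta, e = 0.0, 0.0, energy_kev   # depth, angle vs surface normal
    max_depth = 0.0
    while e > 0.0:
        step = rng.expovariate(1.0 / mfp)
        z += step * math.cos(theta)
        if z < 0.0:                      # electron re-crossed the surface
            return True, max_depth
        max_depth = max(max_depth, z)
        theta = math.acos(rng.uniform(-1.0, 1.0))  # new isotropic direction
        e -= de_per_step
    return False, max_depth

random.seed(1)
runs = [toy_trajectory() for _ in range(2000)]
backscatter_fraction = sum(b for b, _ in runs) / len(runs)
```

Real codes replace the isotropic toy scattering with screened-Rutherford cross sections and track X-ray generation along each trajectory.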

  3. Bioaccumulation of trace element concentrations in common dolphins (Delphinus delphis) from Portugal.

    PubMed

    Monteiro, Sílvia S; Pereira, Andreia T; Costa, Élia; Torres, Jordi; Oliveira, Isabel; Bastos-Santos, Jorge; Araújo, Helder; Ferreira, Marisa; Vingada, José; Eira, Catarina

    2016-12-15

The common dolphin (Delphinus delphis) is one of the most abundant species in Atlantic Iberia, representing a potentially important tool to assess the bioaccumulation of trace elements in the Iberian marine ecosystem. Nine elements (As, Cd, Cu, Hg, Mn, Ni, Pb, Se and Zn) were evaluated in 36 dolphins stranded in continental Portugal. Dolphins had higher Hg concentrations (16.72 μg·g⁻¹ ww, liver) than reported in previous studies in Atlantic Iberia, whereas Cd concentrations (2.26 μg·g⁻¹ ww, kidney) fell within reported ranges. The concentrations of some trace elements (including Cd and Hg) presented positive relationships with dolphin length, presence of parasites and gross pathologies. Common dolphins may help in biomonitoring more offshore Atlantic Iberian areas in future studies, which would otherwise be difficult to assess. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Screening and Evaluation Tool (SET) Users Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pincock, Layne

This document is the user's guide to the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.
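
A minimal sketch of the standard multi-attribute utility method the guide refers to: each option receives a weighted sum of per-metric utilities, with weights summing to 1. The metrics, weights, and fuel cycle options below are invented for illustration:

```python
def utility(option_scores, weights):
    """Weighted-sum multi-attribute utility of one option."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[m] * option_scores[m] for m in weights)

# Illustrative metrics (utilities already normalized to [0, 1]) and weights.
weights = {"waste": 0.4, "cost": 0.35, "proliferation": 0.25}
options = {
    "once-through": {"waste": 0.3, "cost": 0.9, "proliferation": 0.8},
    "full recycle": {"waste": 0.9, "cost": 0.4, "proliferation": 0.6},
}
ranked = sorted(options, key=lambda o: utility(options[o], weights),
                reverse=True)
```

Changing the weights reorders the ranking, which is why such tools typically support sensitivity analysis on the weight vector.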

  5. Utilization of smoking cessation informational, interactive, and online community resources as predictors of abstinence: cohort study.

    PubMed

    An, Lawrence C; Schillo, Barbara A; Saul, Jessie E; Wendling, Ann H; Klatt, Colleen M; Berg, Carla J; Ahulwalia, Jasjit S; Kavanaugh, Annette M; Christenson, Matthew; Luxenberg, Michael G

    2008-12-20

The association between greater utilization of Web-assisted tobacco interventions and increased abstinence rates is well recognized. However, there is little information on how utilization of specific website features influences quitting. To determine the association between utilization of informational, interactive, and online community resources (e.g., bulletin boards) and abstinence rates, with the broader objective of identifying potential strategies for improving outcomes for Web-assisted tobacco interventions. In Spring 2004, a cohort of 607 quitplan.com users consented to participate in an evaluation of quitplan.com, a Minnesota-branded version of QuitNet.com. We developed utilization measures for different site features: general information, interactive diagnostic tools and quit planning tools, online expert counseling, passive (i.e., reading of bulletin boards) and active (i.e., public posting) online community engagement, and one-to-one messaging with other virtual community members. Using bivariate, multivariate, and path analyses, we examined the relationship between utilization of specific site features and 30-day abstinence at 6 months. The most commonly used resources were the interactive quit planning tools (used by 77% of site users). Other informational resources (i.e., quitting guides) were used more commonly (60% of users) than passive (38%) or active (24%) community features. Online community engagement through one-to-one messaging was low (11%), as was use of online counseling (5%). The 30-day abstinence rate among study participants at 6 months was 9.7% (95% Confidence Interval [CI] 7.3% - 12.1%). In the logistic regression model, neither the demographic data (e.g., age, gender, education level, employment, or insurance status) nor the smoking-related data (e.g., cigarettes per day, time to first morning cigarette, baseline readiness to quit) nor use of smoking cessation medications entered the model as significant predictors of abstinence.
Individuals who used the interactive quit planning tools once, two to three times, or four or more times had an odds of abstinence of 0.65 (95% Confidence Interval [CI] 0.22 - 1.94), 1.87 (95% CI 0.77 - 4.56), and 2.35 (95% CI 1.0 - 5.58), respectively. The use of one-to-one messages (reference = none vs 1 or more) entered the final model as potential predictor for abstinence, though the significance of this measure was marginal (OR = 1.91, 95% CI 0.92 - 3.97, P = .083). In the path analysis, an apparent association between active online community engagement and abstinence was accounted for in large part by increased use of interactive quitting tools and one-to-one messaging. Use of interactive quitting tools, and perhaps one-to-one messaging with other members of the online community, was associated with increased abstinence rates among quitplan.com users. Designs that facilitate use of these features should be considered.
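
For readers unfamiliar with the statistics reported above, the sketch below computes an unadjusted odds ratio with a Woolf-type 95% confidence interval from a 2×2 table. The counts are hypothetical and unrelated to the study's actual (adjusted) estimates:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table
    (a = exposed/abstinent, b = exposed/smoking,
     c = unexposed/abstinent, d = unexposed/smoking)
    with a Woolf 95% confidence interval computed on the log scale."""
    oratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(oratio) - z * se)
    hi = math.exp(math.log(oratio) + z * se)
    return oratio, lo, hi

# Hypothetical counts: 20/120 messaging users abstinent vs 39/487 non-users.
orat, lo, hi = odds_ratio_ci(20, 100, 39, 448)
```

An interval whose lower bound falls below 1 corresponds to the marginal (non-significant) associations the authors report.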

  6. Versatile Friction Stir Welding/Friction Plug Welding System

    NASA Technical Reports Server (NTRS)

    Carter, Robert

    2006-01-01

A proposed system of tooling, machinery, and control equipment would be capable of performing any of several friction stir welding (FSW) and friction plug welding (FPW) operations. These operations would include the following: basic FSW; FSW with automated manipulation of the length of the pin tool in real time [the so-called auto-adjustable pin-tool (APT) capability]; self-reacting FSW (SR-FSW); SR-FSW with APT capability and/or real-time adjustment of the distance between the front and back shoulders; and friction plug welding (FPW) [more specifically, friction push plug welding] or friction pull plug welding (FPPW) to close out the keyhole of, or to repair, an FSW or SR-FSW weld. Prior FSW and FPW systems have been capable of performing one or two of these operations, but none has thus far been capable of performing all of them. The proposed system would include a common tool that would have APT capability for both basic FSW and SR-FSW. Such a tool was described in Tool for Two Types of Friction Stir Welding (MFS-31647-1), NASA Tech Briefs, Vol. 30, No. 10 (October 2006), page 70. Going beyond what was reported in the cited previous article, the common tool could be used in conjunction with a plug welding head to perform FPW or FPPW. Alternatively, the plug welding head could be integrated, along with the common tool, into an FSW head that would be capable of all of the aforementioned FSW and FPW operations. Any FSW or FPW operation could be performed under any combination of position and/or force control.

  7. EUnetHTA information management system: development and lessons learned.

    PubMed

    Chalon, Patrice X; Kraemer, Peter

    2014-11-01

The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database) or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). Although nominally a project on IT infrastructure, the work program was also about people. When having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings brings added value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
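
As a concrete illustration of the RSS dissemination channel mentioned above, a minimal RSS 2.0 feed can be built with the Python standard library. The feed title, items, and URLs are invented examples, not EUnetHTA's actual feeds:

```python
import xml.etree.ElementTree as ET

def build_rss(title, link, items):
    """Serialize a minimal RSS 2.0 feed from (item_title, item_link) pairs."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    ET.SubElement(channel, "description").text = title
    for item_title, item_link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "link").text = item_link
    return ET.tostring(rss, encoding="unicode")

feed = build_rss("New HTA reports", "https://example.org/hta",
                 [("Report on device X", "https://example.org/hta/1")])
```

Any RSS reader subscribed to such a feed picks up new items automatically, which is what makes the format useful for rapid dissemination.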

  8. Common bean proteomics: Present status and future strategies.

    PubMed

    Zargar, Sajad Majeed; Mahajan, Reetika; Nazir, Muslima; Nagar, Preeti; Kim, Sun Tae; Rai, Vandna; Masi, Antonio; Ahmad, Syed Mudasir; Shah, Riaz Ahmad; Ganai, Nazir Ahmad; Agrawal, Ganesh K; Rakwal, Randeep

    2017-10-03

Common bean (Phaseolus vulgaris L.) is a legume of appreciable importance and usefulness worldwide, providing food and feed to the human population. It is rich in high-quality protein, energy, fiber and micronutrients, especially iron, zinc, and pro-vitamin A, and possesses potentially disease-preventing and health-promoting compounds. The recently published genome sequence of common bean is an important landmark in common bean research, opening new avenues for understanding its genetics in depth. This legume crop is affected by diverse biotic and abiotic stresses that severely limit its productivity. Given the trend of increasing world population and the need for food crops best suited to the health of humankind, legumes will be in great demand, including the common bean, mostly for its nutritive values. Hence, the need for new research into the biology of this crop leads us to utilize and apply high-throughput omics approaches. In this mini-review our focus will be on the need for proteomics studies in common bean, the potential of proteomics for understanding genetic regulation under abiotic and biotic stresses, and how proteogenomics will lead to nutritional improvement. We will also discuss future proteomics-based strategies that must be adopted to mine new genomic resources by identifying molecular switches regulating various biological processes. Common bean is regarded as a "grain of hope" for the poor, being rich in high-quality protein, energy, fiber and micronutrients (iron, zinc, pro-vitamin A), and possesses potentially disease-preventing and health-promoting compounds. The increasing world population and the need for food crops best suited to the health of humankind put legumes into great demand, including the common bean. An important landmark in common bean research was the recent publication of its genome sequence, opening new avenues for understanding its genetics in depth.
This legume crop is affected by diverse biotic and abiotic stresses that severely limit its productivity. Therefore, the need for new research into the biology of this crop leads us to utilize and apply high-throughput omics approaches. Proteomics can be used to track all the candidate proteins/genes responsible for a biological process under specific conditions in a particular tissue. The potential of proteomics will not only help in determining the functions of a large number of genes in a single experiment but will also be a useful tool to mine new genes that can provide solutions to various problems (abiotic stress, biotic stress, nutritional improvement, etc.). We believe that a combined approach including breeding along with omics tools will lead towards attaining sustainability in legumes, including common bean. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Surfing on the morning after: analysis of an emergency contraception website.

    PubMed

    Gainer, Erin; Sollet, Christian; Ulmann, Marion; Lévy, Delphine; Ulmann, André

    2003-03-01

    The introduction of widespread nonprescription delivery of hormonal emergency contraception (EC) calls for development of innovative tools to provide information to and gather feedback from EC users. Individuals seeking confidential information on sexual health and contraception are increasingly turning to the Internet as the resource of choice. This study employed analytical software and manual content analysis to examine the use of a website dedicated to an EC product (www.norlevo.com) over the course of 2 years. Frequency of visits to and pageviews of the site increased consistently over the 2-year time period, and the bulk of the visitors to the site were EC users seeking responses to frequently asked questions. The most common concern raised by users was the occurrence of spotting and menstrual bleeding following EC use. This analysis reveals that within the context of nonprescription access to hormonal EC, a website can constitute a potent educational tool for health professionals and EC users and provide a valuable source of post-marketing feedback on product use.

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
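
Both discretization error sources can be demonstrated with toy stand-ins: a Monte Carlo "ray trace" whose statistical error shrinks roughly as 1/√N, and piecewise-linear interpolation of a smooth dose-versus-depth curve whose error shrinks as the thickness grid is refined. The geometry and dose curve below are invented for illustration, not the actual analysis tools:

```python
import math, random

def mc_mean_thickness(n, rng):
    """'Ray trace' a toy geometry: average shield thickness over n random
    rays; the statistical error falls off roughly as 1/sqrt(n)."""
    return sum(1.0 + 0.5 * math.sin(rng.uniform(0, math.pi))
               for _ in range(n)) / n

def interp(x, xs, ys):
    """Piecewise-linear interpolation on an increasing grid xs."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside grid")

def dose(depth):
    """Toy dose-vs-depth curve (exponential attenuation)."""
    return math.exp(-depth)

def grid_error(n_points):
    """Worst interpolation error over probe depths for an n-point grid."""
    xs = [i * 2.0 / (n_points - 1) for i in range(n_points)]
    ys = [dose(x) for x in xs]
    probes = [0.05 * k for k in range(1, 40)]
    return max(abs(interp(p, xs, ys) - dose(p)) for p in probes)
```

Tabulating `grid_error` for increasing `n_points` is exactly the kind of convergence test the abstract describes, just on a toy curve.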

  11. Self-reflection as a Tool to Increase Hospitalist Participation in Readmission Quality Improvement.

    PubMed

    Rana, Vipulkumar; Thapa, Bipin; Saini, Sumanta Chaudhuri; Nagpal, Pooja; Segon, Ankur; Fletcher, Kathlyn; Lamb, Geoffrey

    Reducing 30-day readmissions is a national priority. Although multipronged programs have been shown to reduce readmissions, the role of the individual hospitalist physician in reducing readmissions is not clear. We evaluated the effect of physicians' self-review of their own readmission cases on the 30-day readmission rate. Over a 1-year period, hospitalists were sent their individual readmission rates and cases on a weekly basis. They reviewed their cases and completed a data abstraction tool. In addition, a facilitator led small-group discussions about common causes of readmission and ways to prevent such readmissions. Our preintervention readmission rate was 16.16% and our postintervention rate was 14.99% (P = .76). Among hospitalists on duty, nearly all participated in scheduled facilitated discussions. Self-review was completed in 67% of the cases. A facilitated reflective practice intervention increased hospitalist participation and awareness in the mission to reduce readmissions, and it resulted in a nonsignificant trend toward readmission reduction.

  12. Physical activity assessment tools for use in overweight and obese children.

    PubMed

    Ellery, C V L; Weiler, H A; Hazell, T J

    2014-01-01

    The prevalence of excess weight in children and adults worldwide has increased rapidly in the last 25 years. Obesity is positively associated with increased risk for many health issues such as type 2 diabetes, cardiovascular disease and psychosocial problems. This review focuses on child populations, as it is known that the sedentary behaviors of overweight/obese youth often endure into adulthood. Assessment of physical activity (PA), among other factors such as diet and socio-economic status, is important in understanding weight variation and in designing interventions. This review highlights common subjective and objective PA assessment tools, the validity of these methods and acceptable ways of collecting and interpreting PA data. The aim is to provide an update on PA assessment in overweight/obese children, highlighting current knowledge and any gaps in the literature, in order to facilitate the use of PA assessments and interventions by health-care professionals as well as suggest future research in this area.

  13. Co-authorship network analysis in health research: method and potential use.

    PubMed

    Fonseca, Bruna de Paula Fonseca E; Sampaio, Ricardo Barros; Fonseca, Marcus Vinicius de Araújo; Zicker, Fabio

    2016-04-30

    Scientific collaboration networks are a hallmark of contemporary academic research. Researchers are no longer independent players, but members of teams that bring together complementary skills and multidisciplinary approaches around common goals. Social network analysis and co-authorship networks are increasingly used as powerful tools to assess collaboration trends and to identify leading scientists and organizations. The analysis reveals the social structure of the networks by identifying actors and their connections. This article reviews the method and potential applications of co-authorship network analysis in health. The basic steps for conducting co-authorship studies in health research are described and common network metrics are presented. The application of the method is exemplified by an overview of the global research network for Chikungunya virus vaccines.
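    The basic construction step of a co-authorship network can be sketched in a few lines of Python. The author names below are hypothetical, and degree (number of distinct collaborators) stands in for the richer centrality metrics such studies report.

    ```python
    import itertools
    from collections import Counter

    # Hypothetical author lists, one entry per paper (illustrative only).
    papers = [
        ["Silva", "Costa", "Oliveira"],
        ["Silva", "Costa"],
        ["Oliveira", "Pereira"],
        ["Silva", "Pereira", "Santos"],
    ]

    # Build the co-authorship network: every pair of co-authors on a paper
    # shares an edge, weighted by how often the pair collaborates.
    edges = Counter()
    for authors in papers:
        for pair in itertools.combinations(sorted(set(authors)), 2):
            edges[pair] += 1

    # Degree = number of distinct collaborators; a basic centrality measure
    # used to spot leading researchers in the network.
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1

    print("distinct collaborating pairs:", len(edges))
    print("most connected author:", degree.most_common(1)[0][0])
    ```

    Real studies feed author lists parsed from bibliographic databases into the same pairing step, then compute betweenness, closeness, and community structure on the resulting graph.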

  14. An Overview of Tools for Creating, Validating and Using PDS Metadata

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.

    2017-12-01

    NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently PDS has released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some commercial tools, both free and for a fee, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those currently under development and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.

  15. Google and Women’s Health-Related Issues: What Does the Search Engine Data Reveal?

    PubMed Central

    Baazeem, Mazin

    2014-01-01

    Objectives Identifying the gaps in public knowledge of women’s health-related issues has always been difficult. With the increasing number of Internet users in the United States, we sought to use the Internet as a tool to help us identify such gaps and to estimate women’s most prevalent health concerns by examining commonly searched health-related keywords in the Google search engine. Methods We collected a large pool of possible search keywords from two independent practicing obstetrician/gynecologists, classified them into five main categories (obstetrics, gynecology, infertility, urogynecology/menopause and oncology), and measured the monthly average search volume within the United States for each keyword with all its possible combinations using the Google AdWords tool. Results We found that pregnancy-related keywords were less frequently searched in general compared to other categories, with an average of 145,400 hits per month for the top twenty keywords. Among the most common pregnancy-related keywords was "pregnancy and sex," while pregnancy-related diseases were uncommonly searched. HPV alone was searched 305,400 times per month. Of the cancers affecting women, breast cancer was the most commonly searched, with an average of 247,190 searches per month, followed by cervical cancer and then ovarian cancer. Conclusion The commonly searched keywords are often issues that are not discussed in our daily practice or in public health messages. Search volume broadly tracks disease prevalence, with the exception of ovarian cancer, which could signify a public fear. PMID:25422723
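    The aggregation step of such a study can be sketched as follows. The rows stand in for keyword-tool output; only the HPV and breast cancer volumes come from the abstract above, and the remaining keywords, category assignments, and volumes are invented for illustration.

    ```python
    from collections import defaultdict

    # Hypothetical (keyword, category, avg monthly US searches) rows standing
    # in for Google AdWords output. Only the HPV and breast cancer volumes
    # are taken from the abstract above; the rest are made up.
    rows = [
        ("pregnancy and sex", "obstetrics", 60500),
        ("morning sickness", "obstetrics", 27100),
        ("hpv", "gynecology", 305400),
        ("breast cancer", "oncology", 247190),
        ("ovarian cancer", "oncology", 135000),
    ]

    # Aggregate monthly search volume per category, then rank the categories.
    by_category = defaultdict(int)
    for keyword, category, volume in rows:
        by_category[category] += volume

    for category, total in sorted(by_category.items(), key=lambda kv: -kv[1]):
        print(category, total)
    ```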

  16. Clinical Utility of Quantitative Imaging

    PubMed Central

    Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Subramaniam, Rathan M.; Lenchik, Leon

    2014-01-01

    Quantitative imaging (QI) is increasingly applied in modern radiology practice, assisting in the clinical assessment of many patients and providing a source of biomarkers for a spectrum of diseases. QI is commonly used to inform patient diagnosis or prognosis, determine the choice of therapy, or monitor therapy response. Because most radiologists will likely implement some QI tools to meet the patient care needs of their referring clinicians, it is important for all radiologists to become familiar with the strengths and limitations of QI. The Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force has explored the clinical application of QI and summarizes its work in this review. We provide an overview of the clinical use of QI by discussing QI tools that are currently employed in clinical practice, clinical applications of these tools, approaches to reporting of QI, and challenges to implementing QI. It is hoped that these insights will help radiologists recognize the tangible benefits of QI to their patients, their referring clinicians, and their own radiology practice. PMID:25442800

  17. Geochemical Reaction Mechanism Discovery from Molecular Simulation

    DOE PAGES

    Stack, Andrew G.; Kent, Paul R. C.

    2014-11-10

    Methods to explore reactions using computer simulation are becoming increasingly quantitative, versatile, and robust. In this review, a rationale for how molecular simulation can help build better geochemical kinetics models is first given. We summarize some common methods that geochemists use to simulate reaction mechanisms, specifically classical molecular dynamics and quantum chemical methods, and discuss their strengths and weaknesses. Useful tools such as umbrella sampling and metadynamics that enable one to explore reactions are discussed. Several case studies wherein geochemists have used these tools to understand reaction mechanisms are presented, including water exchange and sorption on aqueous species and mineral surfaces, surface charging, crystal growth and dissolution, and electron transfer. The impact that molecular simulation has had on our understanding of geochemical reactivity is highlighted in each case. In the future, it is anticipated that molecular simulation of geochemical reaction mechanisms will become more commonplace as a tool to validate and interpret experimental data, and provide a check on the plausibility of geochemical kinetic models.

  18. VISIT-TS: A multimedia tool for population studies on tic disorders.

    PubMed

    Vachon, M Jonathan; Striley, Catherine W; Gordon, Mollie R; Schroeder, Miriam L; Bihun, Emily C; Koller, Jonathan M; Black, Kevin J

    2016-01-01

    Population-based assessment of Tourette syndrome (TS) and other tic disorders produces a paradox. On one hand, ideally diagnosis of tic disorders requires expert observation. In fact, diagnostic criteria for TS explicitly require expert assessment of tics for a definite diagnosis. On the other hand, large-scale population surveys with expert assessment of every subject are impracticable. True, several published studies have successfully used expert assessment to find tic prevalence in a representative population (e.g. all students in a school district). However, extending these studies to larger populations is daunting. We created a multimedia tool to demonstrate tics to a lay audience, discuss their defining and common attributes, and address features that differentiate tics from other movements and vocalizations. A first version was modified to improve clarity and to include a more diverse group in terms of age and ethnicity. The result is a tool intended for epidemiological research. It may also provide additional benefits, such as more representative minority recruitment for other TS studies and increased community awareness of TS.

  19. Studying mechanism of radical reactions: From radiation to nitroxides as research tools

    NASA Astrophysics Data System (ADS)

    Maimon, Eric; Samuni, Uri; Goldstein, Sara

    2018-02-01

    Radicals are part of the chemistry of life, and ionizing radiation chemistry serves as an indispensable research tool for elucidating the mechanism(s) underlying their reactions. The ever-increasing understanding of their involvement in diverse physiological and pathological processes has expanded the search for compounds that can diminish radical-induced damage. This review surveys areas of research focusing on radical reactions, particularly those of stable cyclic nitroxide radicals, which demonstrate unique antioxidative activities. Unlike common antioxidants, which are progressively depleted under oxidative stress and yield secondary radicals, nitroxides are efficient radical scavengers, yielding in most cases their respective oxoammonium cations, which are readily reduced back to the nitroxide in tissue and thus continuously recycled. Nitroxides, which not only protect enzymes, cells, and laboratory animals from diverse kinds of biological injury but also modify the catalytic activity of heme enzymes, could be utilized in chemical and biological systems as a research tool for elucidating the mechanisms underlying complex chemical and biochemical processes.

  20. Single molecule tools for enzymology, structural biology, systems biology and nanotechnology: an update

    PubMed Central

    Widom, Julia R.; Dhakal, Soma; Heinicke, Laurie A.; Walter, Nils G.

    2015-01-01

    After their initial implementation during the 1990s, single-molecule fluorescence detection tools were quickly recognized for their potential to contribute greatly to many different areas of scientific inquiry. In the intervening time, technical advances in the field have generated ever-improving spatial and temporal resolution, and have enabled the application of single-molecule fluorescence to increasingly complex systems, such as live cells. In this review, we give an overview of the optical components necessary to implement the most common versions of single-molecule fluorescence detection. We then discuss current applications to enzymology and structural studies, systems biology, and nanotechnology, presenting the technical considerations that are unique to each area of study, along with noteworthy recent results. We also highlight future directions that have the potential to revolutionize these areas of study by further exploiting the capabilities of single-molecule fluorescence microscopy. PMID:25212907

  1. On constraining pilot point calibration with regularization in PEST

    USGS Publications Warehouse

    Fienen, M.N.; Muffels, C.T.; Hunt, R.J.

    2009-01-01

    Ground water model calibration has made great advances in recent years, with practical tools such as PEST being instrumental in making the latest techniques available to practitioners. As models and calibration tools get more sophisticated, however, the power of these tools can be misapplied, resulting in poor parameter estimates and/or nonoptimally calibrated models that do not suit their intended purpose. Here, we focus on an increasingly common technique for calibrating highly parameterized numerical models - pilot point parameterization with Tikhonov regularization. Pilot points are a popular method for spatially parameterizing complex hydrogeologic systems; however, the additional flexibility offered by pilot points can become problematic if not constrained by Tikhonov regularization. The objective of this work is to explain and illustrate the specific roles played by control variables in the PEST software for Tikhonov regularization applied to pilot points. A recent study encountered difficulties implementing this approach, but examination of that analysis yields insight into the underlying sources of potential misapplication and some guidelines for overcoming them. © 2009 National Ground Water Association.
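    The stabilizing effect of Tikhonov regularization can be illustrated on a toy linear problem. This is not PEST itself, and the penalty weight below is arbitrary, but it shows the same idea: an over-flexible (nearly collinear) parameterization lets observation noise blow up the estimates, and a zero-order Tikhonov penalty pulls them back to a preferred, plausible solution.

    ```python
    import numpy as np

    # Toy illustration of zero-order Tikhonov regularization (the stabilizing
    # idea behind regularized pilot-point calibration; hypothetical problem).
    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 10))
    A[:, 1] = A[:, 0] + 1e-6 * rng.normal(size=20)   # nearly collinear -> ill-posed
    x_true = np.ones(10)
    b = A @ x_true + 0.01 * rng.normal(size=20)      # noisy observations

    def solve(lmbda):
        # Minimize ||Ax - b||^2 + lmbda * ||x||^2 via the normal equations.
        return np.linalg.solve(A.T @ A + lmbda * np.eye(A.shape[1]), A.T @ b)

    x_plain = solve(0.0)    # unregularized: noise amplified along the weak direction
    x_reg = solve(1e-3)     # regularized: estimates stay near the true values
    print("unregularized parameter norm:", np.linalg.norm(x_plain))
    print("regularized parameter norm:  ", np.linalg.norm(x_reg))
    ```

    In PEST the penalty is expressed through preferred-value or preferred-difference regularization equations and the weight is adjusted automatically, but the trade-off being controlled is the one sketched here.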

  2. Developing a biomonitoring tool for fine sediment

    NASA Astrophysics Data System (ADS)

    Turley, Matt; Bilotta, Gary; Brazier, Richard; Extence, Chris

    2014-05-01

    Sediment is an essential component of freshwater ecosystems; however, anthropogenic activities can lead to elevated sediment delivery, which can impact the physical, chemical and biological characteristics of these ecosystems. Ultimately, this can result in a loss of ecosystem services worth more than 1.7 trillion per annum. As such, it is important that sediment, one of the most commonly attributed causes of water quality impairment globally, is managed in order to minimise these impacts. The current EU environmental quality standard for sediment (monitored in the form of suspended solids) is 25 mg L-1 for all environments. It is widely recognised that this standard is unsuitable and not ecologically relevant. Furthermore, it requires a substantial resource investment to monitor sediment in this form as part of national and international water resource legislation. In recognition of this, the development of sediment-specific biomonitoring tools is receiving increasing attention. The Proportion of Sediment-Sensitive Invertebrates (PSI) index is one such tool, designed to indicate levels of fine sediment.

  3. Circulating Cell Free Tumor DNA Detection as a Routine Tool for Lung Cancer Patient Management

    PubMed Central

    Vendrell, Julie A.; Mau-Them, Frédéric Tran; Béganton, Benoît; Godreuil, Sylvain; Coopman, Peter; Solassol, Jérôme

    2017-01-01

    Circulating tumoral DNA (ctDNA), commonly named "liquid biopsy", has emerged as a promising new noninvasive tool to detect biomarkers in several cancers, including lung cancer. Applications involving molecular analysis of ctDNA in lung cancer have increased and encompass diagnosis, response to treatment, acquired resistance and prognosis prediction, while bypassing the problem of tumor heterogeneity. ctDNA may thus enable dynamic genetic surveillance in the era of precision medicine by providing indirect access to tumoral genomic information. The aims of this review were to examine the recent technical developments that allowed the detection of genetic alterations of ctDNA in lung cancer. Furthermore, we explored clinical applications in patients with lung cancer, including treatment efficiency monitoring, acquired therapy resistance mechanisms and prognostic value. PMID:28146051

  4. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.
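    The chapter's examples are in Perl; the same stream-filter-reformat pattern is sketched here in Python (for consistency with the other snippets on this page) over made-up tab-separated data, with a hypothetical 0.5 score cutoff.

    ```python
    import io

    # Stand-in for a large tab-separated output file (id, score); the data
    # and the 0.5 cutoff are invented for illustration.
    raw = io.StringIO("gene1\t0.8\ngene2\t0.2\ngene3\t0.95\n")

    # One-liner-style pass: stream line by line, filter by score, emit CSV.
    # Memory use stays flat no matter how large the input file is.
    kept = [
        f"{gid},{score}"
        for gid, score in (line.rstrip("\n").split("\t") for line in raw)
        if float(score) >= 0.5
    ]
    print("\n".join(kept))
    ```

    For a real file, replace the `io.StringIO` stand-in with `open(path)`; the streaming logic is unchanged.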

  5. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of today's early career engineers are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient at building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  6. SigTree: A Microbial Community Analysis Tool to Identify and Visualize Significantly Responsive Branches in a Phylogenetic Tree.

    PubMed

    Stevens, John R; Jones, Todd R; Lefevre, Michael; Ganesan, Balasubramanian; Weimer, Bart C

    2017-01-01

    Microbial community analysis experiments to assess the effect of a treatment intervention (or environmental change) on the relative abundance levels of multiple related microbial species (or operational taxonomic units) simultaneously using high-throughput genomics are becoming increasingly common. Within the framework of the evolutionary phylogeny of all species considered in the experiment, this translates to a statistical need to identify the phylogenetic branches that exhibit a significant consensus response (in terms of operational taxonomic unit abundance) to the intervention. We present the R software package SigTree, a collection of flexible tools that make use of meta-analysis methods and regular expressions to identify and visualize significantly responsive branches in a phylogenetic tree, while appropriately adjusting for multiple comparisons.
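    SigTree's own functions are not reproduced here, but one standard meta-analysis ingredient for this kind of problem, Fisher's method for combining per-tip p-values into a single branch-level p-value, can be sketched with only the standard library. The branch p-values below are hypothetical.

    ```python
    import math

    def chi2_sf_even_df(x, df):
        # Survival function of a chi-square variable with even df, using the
        # closed form exp(-x/2) * sum_{i<k} (x/2)^i / i!  where df = 2k.
        k = df // 2
        term, total = 1.0, 1.0
        for i in range(1, k):
            term *= (x / 2.0) / i
            total += term
        return math.exp(-x / 2.0) * total

    def fisher_combined_p(pvalues):
        # Fisher's method: -2 * sum(ln p_i) ~ chi-square with 2n df under H0.
        stat = -2.0 * sum(math.log(p) for p in pvalues)
        return chi2_sf_even_df(stat, 2 * len(pvalues))

    # Hypothetical per-tip p-values for the species under one branch.
    print(round(fisher_combined_p([0.04, 0.03, 0.20]), 4))
    ```

    As a sanity check, combining a single p-value returns it unchanged; combining several modest p-values yields a smaller branch-level p-value, which is the consensus-response signal the package looks for (before any multiple-comparison adjustment).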

  7. Application of Warranties in the Procurement of Spare Parts at the Navy Field Contracting System Level

    DTIC Science & Technology

    1987-06-01

    APPLICATION OF WARRANTIES IN THE PROCUREMENT OF SPARE...become commonplace in the United States and throughout the world. Due to the competitive nature of the market, buyers receive warranty coverage regardless...product warranties had primarily been viewed by manufacturers as a marketing tool. [Ref. 7: p. 1-4] However, the increased economic significance of

  8. Continuous monitoring of blood pressure in children and adolescents, a review of the literature.

    PubMed

    Mercado, Arlene B

    2008-08-01

    Continuous or ambulatory blood pressure monitoring (CBPM or ABPM) is becoming a useful tool in the early detection of hypertension in children and adolescents. With increased obesity in pediatrics, chronic diseases such as hypertension, diabetes, dyslipidemia and metabolic syndrome, which in earlier years were more commonly seen in adults, can now be seen in this population. This review provides the clinical reports of the use of CBPM for the diagnosis and management of hypertension in the pediatric population.

  9. Can We Afford Not to Evaluate Services for Elderly Persons with Dementia?

    PubMed Central

    Worrall, Graham; Chambers, Larry W.

    1989-01-01

    With the increasing expenditure on health-care programs for seniors, there is an urgent need to evaluate such programs. The Measurement Iterative Loop is a tool that can provide both health administrators and health researchers with a method of evaluation of existing programs and identification of gaps in knowledge, and forms a rational basis for health-care policy decisions. In this article, the Loop is applied to one common problem of the elderly: dementia. PMID:21248993

  10. Surgical research using national databases

    PubMed Central

    Leland, Hyuma; Heckmann, Nathanael

    2016-01-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research. PMID:27867945

  11. Surgical research using national databases.

    PubMed

    Alluri, Ram K; Leland, Hyuma; Heckmann, Nathanael

    2016-10-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research.

  12. "Epidemiological criminology": coming full circle.

    PubMed

    Akers, Timothy A; Lanier, Mark M

    2009-03-01

    Members of the public health and criminal justice disciplines often work with marginalized populations: people at high risk of drug use, health problems, incarceration, and other difficulties. As these fields increasingly overlap, distinctions between them are blurred, as numerous research reports and funding trends document. However, explicit theoretical and methodological linkages between the 2 disciplines remain rare. A new paradigm that links methods and statistical models of public health with those of their criminal justice counterparts is needed, as are increased linkages between epidemiological analogies, theories, and models and the corresponding tools of criminology. We outline disciplinary commonalities and distinctions, present policy examples that integrate similarities, and propose "epidemiological criminology" as a bridging framework.

  13. Hybrid Platforms, Tools, and Resources

    ERIC Educational Resources Information Center

    Linder, Kathryn E.; Bruenjes, Linda S.; Smith, Sarah A.

    2017-01-01

    This chapter discusses common tools and resources for building a hybrid course in a higher education setting and provides recommendations for best practices in Learning Management Systems and Open Educational Resources.

  14. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    ERIC Educational Resources Information Center

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands…

  15. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By covering the full risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  16. Workshop on Aeronautical Decision Making (ADM). Volume 2. Plenary Session with Presentations and Proposed Action Plan

    DTIC Science & Technology

    1992-08-01

    programs have several common functional components dealing with: attention, crew, stress, mental attitude, and risk issues. The role which the five...five interrelated concept areas furnish "rules and tools" to help prevent common errors. For instance: 1. Attention management issues include...the pilot must manage his/her attention in a timely manner and sequentially employ the other cockpit management tools (for controlling stress etc.). The text

  17. WE-G-BRC-02: Risk Assessment for HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayadev, J.

    2016-06-15

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.
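    The core FMEA scoring step can be sketched as a Risk Priority Number ranking (RPN = severity × occurrence × detectability, each commonly scored 1-10, with a higher detectability score meaning harder to detect). The failure modes and scores below are hypothetical examples, not values from the session described above.

    ```python
    # Rank hypothetical failure modes by Risk Priority Number,
    # RPN = severity x occurrence x detectability, each scored 1-10
    # (higher detectability score = harder to detect). Example values only.
    failure_modes = [
        # (description, severity, occurrence, detectability)
        ("wrong source strength entered", 9, 2, 4),
        ("catheter reconstruction offset", 7, 4, 5),
        ("plan transferred to wrong patient", 10, 1, 2),
    ]

    ranked = sorted(
        ((s * o * d, name) for name, s, o, d in failure_modes),
        reverse=True,
    )
    for rpn, name in ranked:
        print(f"RPN {rpn:3d}  {name}")
    ```

    The highest-RPN modes are where quality-management effort is directed first, though a very high severity score alone can also warrant intervention regardless of RPN.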

  18. WE-G-BRC-01: Risk Assessment for Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, G.

    2016-06-15

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.

  19. WE-G-BRC-03: Risk Assessment for Physics Plan Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, S.

    2016-06-15

Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures, and of how to characterize new devices for safe use in patient treatments, will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) Learn how to characterize new equipment for safety; (3) Be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) Be able to customize FMEA examples and templates for use in any clinic.

  20. Development of an Electronic Medical Record-Based Clinical Decision Support Tool to Improve HIV Symptom Management

    PubMed Central

    Tsevat, Joel; Justice, Amy C.; Mrus, Joseph M.; Levin, Forrest; Kozal, Michael J.; Mattocks, Kristin; Farber, Steven; Rogers, Michelle; Erdos, Joseph; Brandt, Cynthia; Kudel, Ian; Braithwaite, Ronald

    2009-01-01

Common symptoms associated with HIV disease and its management are often underrecognized and undertreated. A clinical decision support tool for symptom management was developed within the Veterans Health Administration electronic medical record (EMR), aimed at increasing provider awareness of and response to common HIV symptoms. Its feasibility was studied from March to May 2007 by implementing it within a weekly HIV clinic, comparing a 4-week intervention period with a 4-week control period. Fifty-six patients and their providers participated in the study. Patients' perceptions of providers' awareness of their symptoms, the proportion of progress notes mentioning any symptom(s), and the proportion of care plans mentioning any symptom(s) were measured. The clinical decision support tool used portable electronic “tablets” to elicit symptom information at the time of check-in; filtered and organized that information into a concise and clinically relevant EMR note available at the point of care; and facilitated clinical responses to that information. It appeared to be well accepted by patients and providers and did not substantially impact workflow. Although this pilot study was not powered to detect effectiveness, 25 (93%) patients in the intervention group reported that their providers were very aware of their symptoms versus 27 (75%) control patients (p = 0.07). The proportion of providers' notes listing symptoms was similar in both periods; however, there was a trend toward including a greater number of symptoms in intervention-period progress notes. The symptom support tool seemed to be useful in clinical HIV care. The Veterans Health Administration EMR may be an effective “laboratory” for developing and testing decision supports. PMID:19538046

  1. A typology of educationally focused medical simulation tools.

    PubMed

    Alinier, Guillaume

    2007-10-01

The concept of simulation as an educational tool in healthcare is not a new idea but its use has really blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification of the technology available but also of the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are, respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus valid comparison of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed. It should also be a useful resource for simulation users who are trying to improve their educational practice.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures, and of how to characterize new devices for safe use in patient treatments, will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) Learn how to characterize new equipment for safety; (3) Be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) Be able to customize FMEA examples and templates for use in any clinic.

  3. Benchmarking Tool Kit.

    ERIC Educational Resources Information Center

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  4. Competency assessment tools: An exploration of the pedagogical issues facing competency assessment for nurses in the clinical environment.

    PubMed

    Franklin, Natasha; Melville, Paula

    2015-01-01

Competency assessment is a paradigm that is common in the healthcare environment, and this is particularly true within the nursing profession. Demonstration of competence is necessary to meet the requirements of healthcare organisations and is mandated for nurses by the Nursing and Midwifery Board of Australia. Within the nursing education sector, one approach to determining competence is the use of competency assessment tools. Despite their widespread use, there remain ongoing concerns about the efficacy of competency assessment tools as a means to demonstrate 'competency' among enrolled and registered nurses in the clinical environment. The authors of this paper assert that competency assessment tools run a serious risk of being nothing more than a 'quick-fix' means of assessment to demonstrate 'nursing competence' required for key performance indicators and clinical governance, and to provide evidence for accreditation standards. Based on this premise, the authors provide an alternative approach to the use of competency assessment tools that moves away from a 'tick-box' approach to a 'patient-centred' competency model. This approach increases the reliability and validity of competency assessments; allows for the recognition of the knowledge, skills and experience of individual nurses; offers a more satisfying and rewarding approach to demonstrating 'competency' for nurses; and, finally, demonstrates 'real-life' competency.

  5. Optimizing ATLAS code with different profilers

    NASA Astrophysics Data System (ADS)

    Kama, S.; Seuster, R.; Stewart, G. A.; Vitillo, R. A.

    2014-06-01

After the current maintenance period, the LHC will provide higher energy collisions with increased luminosity. In order to keep up with these higher rates, ATLAS software needs to speed up substantially. However, ATLAS code is composed of approximately 6M lines, written by many different programmers with different backgrounds, which makes code optimisation a challenge. To help with this effort different profiling tools and techniques are being used. These include well known tools, such as the Valgrind suite and Intel Amplifier; less common tools like Pin, PAPI, and GOoDA; as well as techniques such as library interposing. In this paper we will mainly focus on Pin tools and GOoDA. Pin is a dynamic binary instrumentation tool which can obtain statistics such as call counts and instruction counts, and can interrogate functions' arguments. It has been used to obtain CLHEP Matrix profiles, operations and vector sizes for linear algebra calculations, which has provided the insight necessary to achieve significant performance improvements. Complementing this, GOoDA, an in-house performance tool built in collaboration with Google, which is based on hardware performance monitoring unit events, is used to identify hot-spots in the code for different types of hardware limitations, such as CPU resources, caches, or memory bandwidth. GOoDA has been used in improvement of the performance of new magnetic field code and identification of potential vectorization targets in several places, such as Runge-Kutta propagation code.
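The hot-spot-finding workflow described above (run the workload under a profiler, then rank functions by where time is spent) can be sketched with Python's built-in `cProfile`; this is an illustration of the general technique only, not one of the C++ instrumentation tools named in the abstract, and `slow_sum`/`workload` are made-up stand-ins for real code:

```python
# Minimal profiling sketch: run a workload under cProfile, then rank
# functions by cumulative time to find hot spots worth optimising.
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately naive loop to create a measurable hot spot.
    total = 0
    for i in range(n):
        total += i * i
    return total

def workload():
    return [slow_sum(50_000) for _ in range(20)]

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Report the most expensive functions, analogous to how a hardware-counter
# profiler surfaces the code regions that dominate execution time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Binary-instrumentation tools like Pin work differently under the hood (they rewrite instructions at run time rather than hooking the interpreter), but the analysis loop of measure, rank, and optimise the top entries is the same.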

  6. Knowledge representation for commonality

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1990-01-01

    Domain-specific knowledge necessary for commonality analysis falls into two general classes: commonality constraints and costing information. Notations for encoding such knowledge should be powerful and flexible and should appeal to the domain expert. The notations employed by the Commonality Analysis Problem Solver (CAPS) analysis tool are described. Examples are given to illustrate the main concepts.

  7. Tools for Data Analysis in the Middle School Classroom: A Teacher Professional Development Program

    NASA Astrophysics Data System (ADS)

    Ledley, T. S.; Haddad, N.; McAuliffe, C.; Dahlman, L.

    2006-12-01

    In order for students to learn how to engage with scientific data to answer questions about the real world, it is imperative that their teachers are 1) comfortable with the data and the tools used to analyze it, and 2) feel prepared to support their students in this complex endeavor. TERC's Tools for Data Analysis in the Middle School Classroom (DataTools) professional development program, funded by NSF's ITEST program, prepares middle school teachers to integrate Web-based scientific data and analysis tools into their existing curricula. This 13-month program supports teachers in using a set of freely or commonly available tools with a wide range of data. It also gives them an opportunity to practice teaching these skills to students before teaching in their own classrooms. The ultimate goal of the program is to increase the number of middle school students who work directly with scientific data, who use the tools of technology to import, manipulate, visualize and analyze the data, who come to understand the power of data-based arguments, and who will consider pursuing a career in technical and scientific fields. In this session, we will describe the elements of the DataTools program and the Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet), a Web-based resource that supports Earth system education for teachers and students in grades 6 through 16. The EET provides essential support to DataTools teachers as they use it to learn to locate and download Web-based data and use data analysis tools. We will also share what we have learned during the first year of this three-year program.

  8. Common errors in multidrug-resistant tuberculosis management.

    PubMed

    Monedero, Ignacio; Caminero, Jose A

    2014-02-01

Multidrug-resistant tuberculosis (MDR-TB), defined as being resistant to at least rifampicin and isoniazid, has an increasing burden and threatens TB control. Diagnosis is limited and usually delayed, while treatment is long-lasting, toxic and poorly effective. MDR-TB management in scarce-resource settings is demanding; however, it is feasible and extremely necessary. In these settings, cure rates do not usually exceed 60-70% and MDR-TB management is novel for many TB programs. In this challenging scenario, both clinical and programmatic errors are likely to occur. The majority of these errors may be prevented or alleviated with appropriate and timely training, in addition to uninterrupted procurement of high-quality drugs, updated national guidelines and laws, and an overall improvement in management capacities. While new tools for diagnosis and shorter and less toxic treatments are not available in developing countries, MDR-TB management will remain complex in scarce-resource settings. Focusing special attention on the common errors in diagnosis, regimen design and especially treatment delivery may benefit patients and programs with current outdated tools. The present article is a compilation of typical errors repeatedly observed by the authors in a wide range of countries during technical assistance missions and trainings.

  9. Long-wave plasma radiofrequency ablation for treatment of xanthelasma palpebrarum.

    PubMed

    Baroni, Adone

    2018-03-01

Xanthelasma palpebrarum is the most common type of xanthoma affecting the eyelids. It is characterized by asymptomatic soft yellowish macules, papules, or plaques over the upper and lower eyelids. Many treatments are available for management of xanthelasma palpebrarum; the most commonly used include surgical excision, ablative CO2 or erbium lasers, nonablative Q-switched Nd:YAG laser, trichloroacetic acid peeling, and radiofrequency ablation. This study aims to evaluate the effectiveness of RF ablation in the treatment of xanthelasma palpebrarum with the D.A.S. Medical portable device (Technolux, Italia), a radiofrequency tool working with long-wave plasma energy and without anesthesia. Twenty patients, 15 female and 5 male, affected by xanthelasma palpebrarum were enrolled for long-wave plasma radiofrequency ablation treatment. The treatment consisted of 3-4 sessions that were carried out at intervals of 30 days. Treatments were well tolerated by all patients with no adverse effects and optimal aesthetic results. The procedure is very fast and can be performed without anesthesia because of the low and tolerable pain stimulation. Long-wave plasma radiofrequency ablation is an effective option for treatment of xanthelasma palpebrarum and adds an additional tool to the increasing list of medical devices for aesthetic treatments. © 2018 Wiley Periodicals, Inc.

  10. Success of commonly used operating room management tools in reducing tardiness of first case of the day starts: evidence from German hospitals.

    PubMed

    Ernst, Christian; Szczesny, Andrea; Soderstrom, Naomi; Siegmund, Frank; Schleppers, Alexander

    2012-09-01

One of the declared objectives of surgical suite management in Germany is to increase operating room (OR) efficiency by reducing tardiness of first case of the day starts. We analyzed whether the introduction of OR management tools by German hospitals in response to increasing economic pressure was successful in achieving this objective. The OR management tools we considered were the appointment of an OR manager and the development and adoption of a surgical suite governance document (OR charter). We hypothesized that tardiness of first case starts was less in ORs that have adopted one or both of these tools. Using representative 2005 survey data from 107 German anesthesiology departments, we used a Tobit model to estimate the effect of the introduction of an OR manager or OR charter on tardiness of first case starts, while controlling for hospital size and surgical suite complexity. Adoption reduced tardiness of first case starts by at least 7 minutes (mean reduction 15 minutes, 95% confidence interval (CI): 7-22 minutes, P < 0.001). Reductions in tardiness of first case starts figure prominently among the objectives of surgical suite management in Germany. Our results suggest that the appointment of an OR manager or the adoption of an OR charter supports this objective. For short-term decision making on the day of surgery, this reduction in tardiness may have economic implications, because it reduced overutilized OR time.
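The Tobit model used in this study handles the fact that tardiness cannot be negative: ordinary least squares on such left-censored data is biased, so the model maximizes a likelihood that treats zero-tardiness observations as censored draws from a latent variable. A sketch on synthetic data (the effect size, noise level, and variable names here are invented for illustration, not the study's estimates):

```python
# Tobit (left-censored at zero) regression sketch on synthetic data.
# Latent tardiness can be negative (a case could start early), but observed
# tardiness is max(latent, 0), so zeros are censored observations.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 500
has_or_manager = rng.integers(0, 2, n)               # adoption indicator (assumed)
latent = 20.0 - 12.0 * has_or_manager + rng.normal(0, 10, n)
tardiness = np.maximum(latent, 0.0)                  # left-censored at zero

def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                        # keep sigma positive
    mu = b0 + b1 * has_or_manager
    censored = tardiness <= 0
    ll = np.where(
        censored,
        stats.norm.logcdf(-mu / sigma),              # P(latent <= 0)
        stats.norm.logpdf(tardiness, mu, sigma),     # density if uncensored
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[10.0, 0.0, np.log(5.0)], method="Nelder-Mead")
b0_hat, b1_hat, _ = res.x
print(f"estimated effect of adoption on tardiness: {b1_hat:.1f} minutes")
```

Because the censored observations contribute only the probability of falling at the bound, the maximum-likelihood estimate recovers the latent-scale effect that a plain regression on the truncated outcome would understate.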

  11. Development of a prototype commonality analysis tool for use in space programs

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1988-01-01

    A software tool to aid in performing commonality analyses, called Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. The CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. The CAPS is designed around a simple input language which provides a natural syntax for the description of feasibility constraints. It provides its users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and do a comprehensive cost analysis to find the most economical substitution pattern.

  12. Drugging the PI3 Kinome: From Chemical Tools to Drugs in the Clinic

    PubMed Central

    Workman, Paul; Clarke, Paul A; Raynaud, Florence I; van Montfort, Rob LM

    2011-01-01

    The phosphatidylinositide 3-kinase (PI3K) pathway is very commonly activated in a wide range of human cancers and is a major driving force in oncogenesis. One of the class I lipid kinase members of the PI3K family, p110α, is probably the most commonly mutated kinase in the human genome. Alongside genetic, molecular biological and biochemical studies, chemical inhibitors have been extremely helpful tools in understanding the role of PI3K enzymes in signal transduction and downstream physiological and pathological processes, and also in validating PI3Ks as therapeutic targets. Although they have been valuable in the past, the early and still frequently employed inhibitors, wortmannin and LY294002, have significant limitations as chemical tools. Here, we discuss the case history of the discovery and properties of an increasingly used chemical probe, the pan-class I PI3K and mTOR inhibitor PI-103 (a pyridofuropyrimidine) and its very recent evolution into the thienopyrimidine drug GDC-0941 that exhibits excellent oral anticancer activity in preclinical models and is now undergoing Phase I clinical trials in cancer patients. We also illustrate the impact of structural biology on the design of PI3K inhibitors and on the interpretation of their effects. The challenges and outlook for drugging the PI3 kinome are discussed in the more general context of the role of structural biology and chemical biology in innovative drug discovery. PMID:20179189

  13. Model Development for EHR Interdisciplinary Information Exchange of ICU Common Goals

    PubMed Central

    Collins, Sarah A.; Bakken, Suzanne; Vawdrey, David K.; Coiera, Enrico; Currie, Leanne

    2010-01-01

    Purpose Effective interdisciplinary exchange of patient information is an essential component of safe, efficient, and patient–centered care in the intensive care unit (ICU). Frequent handoffs of patient care, high acuity of patient illness, and the increasing amount of available data complicate information exchange. Verbal communication can be affected by interruptions and time limitations. To supplement verbal communication, many ICUs rely on documentation in electronic health records (EHRs) to reduce errors of omission and information loss. The purpose of this study was to develop a model of EHR interdisciplinary information exchange of ICU common goals. Methods The theoretical frameworks of distributed cognition and the clinical communication space were integrated and a previously published categorization of verbal information exchange was used. 59.5 hours of interdisciplinary rounds in a Neurovascular ICU were observed and five interviews and one focus group with ICU nurses and physicians were conducted. Results Current documentation tools in the ICU were not sufficient to capture the nurses' and physicians' collaborative decision-making and verbal communication of goal-directed actions and interactions. Clinicians perceived the EHR to be inefficient for information retrieval, leading to a further reliance on verbal information exchange. Conclusion The model suggests that EHRs should support: 1) Information tools for the explicit documentation of goals, interventions, and assessments with synthesized and summarized information outputs of events and updates; and 2) Messaging tools that support collaborative decision-making and patient safety double checks that currently occur between nurses and physicians in the absence of EHR support. PMID:20974549

  14. Supplementary insurance as a switching cost for basic health insurance: Empirical results from the Netherlands.

    PubMed

    Willemse-Duijmelinck, Daniëlle M I D; van de Ven, Wynand P M M; Mosca, Ilaria

    2017-10-01

Nearly everyone with a supplementary insurance (SI) in the Netherlands takes out the voluntary SI and the mandatory basic insurance (BI) from the same health insurer. Previous studies show that many high-risks perceive SI as a switching cost for BI. Because consumers' current insurer provides them with guaranteed renewability, SI is a switching cost if insurers apply selective underwriting to new applicants. Several changes in the Dutch health insurance market increased insurers' incentives to counteract adverse selection for SI. Tools to do so include not only selective underwriting, but also risk rating and product differentiation. If all insurers use the latter tools without selective underwriting, SI is not a switching cost for BI. We investigated to what extent insurers used these tools in the periods 2006-2009 and 2014-2015. Only a few insurers applied selective underwriting: in 2015, 86% of insurers used open enrolment for all their SI products, and the other 14% did use open enrolment for their most common SI products. As measured by our indicators, the proportion of insurers applying risk rating or product differentiation did not increase in the periods considered. Due to the fear of reputation loss, insurers may have used 'less visible' tools to counteract adverse selection that are indirect forms of risk rating and product differentiation and do not result in switching costs. So, although many high-risks perceive SI as a switching cost, most insurers apply open enrolment for SI. By providing information to high-risks about their switching opportunities, the government could increase consumer choice and thereby insurers' incentives to invest in high-quality care for high-risks. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Pressing needs of biomedical text mining in biocuration and beyond: opportunities and challenges

    PubMed Central

    Singhal, Ayush; Leaman, Robert; Catlett, Natalie; Lemberger, Thomas; McEntyre, Johanna; Polson, Shawn; Xenarios, Ioannis; Arighi, Cecilia; Lu, Zhiyong

    2016-01-01

    Text mining in the biomedical sciences is rapidly transitioning from small-scale evaluation to large-scale application. In this article, we argue that text-mining technologies have become essential tools in real-world biomedical research. We describe four large scale applications of text mining, as showcased during a recent panel discussion at the BioCreative V Challenge Workshop. We draw on these applications as case studies to characterize common requirements for successfully applying text-mining techniques to practical biocuration needs. We note that system ‘accuracy’ remains a challenge and identify several additional common difficulties and potential research directions including (i) the ‘scalability’ issue due to the increasing need of mining information from millions of full-text articles, (ii) the ‘interoperability’ issue of integrating various text-mining systems into existing curation workflows and (iii) the ‘reusability’ issue on the difficulty of applying trained systems to text genres that are not seen previously during development. We then describe related efforts within the text-mining community, with a special focus on the BioCreative series of challenge workshops. We believe that focusing on the near-term challenges identified in this work will amplify the opportunities afforded by the continued adoption of text-mining tools. Finally, in order to sustain the curation ecosystem and have text-mining systems adopted for practical benefits, we call for increased collaboration between text-mining researchers and various stakeholders, including researchers, publishers and biocurators. PMID:28025348
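The simplest form of the biocuration text mining discussed above is dictionary-based entity tagging: match a curated lexicon of entity names against article text and record the hits. Production systems use trained named-entity recognizers rather than exact matching, and the lexicon entries and sentence below are illustrative only:

```python
# Toy dictionary-based biomedical entity tagger. Real text-mining systems
# use trained NER models; this sketches only the lexicon-matching baseline.
import re

# Hypothetical lexicon mapping surface forms to entity types.
LEXICON = {
    "BRCA1": "gene",
    "TP53": "gene",
    "lung cancer": "disease",
    "tuberculosis": "disease",
}

def tag_entities(text):
    """Return (offset, term, type) for every lexicon term found in text."""
    hits = []
    for term, etype in LEXICON.items():
        for m in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
            hits.append((m.start(), term, etype))
    return sorted(hits)  # sort by offset for document order

sentence = "Mutations in BRCA1 and TP53 are studied in lung cancer cohorts."
print(tag_entities(sentence))
```

The 'scalability' issue the authors raise is visible even here: exact matching over millions of full-text articles requires indexing (e.g. Aho-Corasick automata) rather than per-term regex scans, and the 'reusability' issue appears as soon as the lexicon meets a text genre with unseen surface forms.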

  16. Pressing needs of biomedical text mining in biocuration and beyond: opportunities and challenges

    DOE PAGES

    Singhal, Ayush; Leaman, Robert; Catlett, Natalie; ...

    2016-12-26

Text mining in the biomedical sciences is rapidly transitioning from small-scale evaluation to large-scale application. In this article, we argue that text-mining technologies have become essential tools in real-world biomedical research. We describe four large scale applications of text mining, as showcased during a recent panel discussion at the BioCreative V Challenge Workshop. We draw on these applications as case studies to characterize common requirements for successfully applying text-mining techniques to practical biocuration needs. We note that system ‘accuracy’ remains a challenge and identify several additional common difficulties and potential research directions including (i) the ‘scalability’ issue due to the increasing need of mining information from millions of full-text articles, (ii) the ‘interoperability’ issue of integrating various text-mining systems into existing curation workflows and (iii) the ‘reusability’ issue on the difficulty of applying trained systems to text genres that are not seen previously during development. We then describe related efforts within the text-mining community, with a special focus on the BioCreative series of challenge workshops. We believe that focusing on the near-term challenges identified in this work will amplify the opportunities afforded by the continued adoption of text-mining tools. In conclusion, in order to sustain the curation ecosystem and have text-mining systems adopted for practical benefits, we call for increased collaboration between text-mining researchers and various stakeholders, including researchers, publishers and biocurators.

  17. Pressing needs of biomedical text mining in biocuration and beyond: opportunities and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singhal, Ayush; Leaman, Robert; Catlett, Natalie

Text mining in the biomedical sciences is rapidly transitioning from small-scale evaluation to large-scale application. In this article, we argue that text-mining technologies have become essential tools in real-world biomedical research. We describe four large scale applications of text mining, as showcased during a recent panel discussion at the BioCreative V Challenge Workshop. We draw on these applications as case studies to characterize common requirements for successfully applying text-mining techniques to practical biocuration needs. We note that system ‘accuracy’ remains a challenge and identify several additional common difficulties and potential research directions including (i) the ‘scalability’ issue due to the increasing need of mining information from millions of full-text articles, (ii) the ‘interoperability’ issue of integrating various text-mining systems into existing curation workflows and (iii) the ‘reusability’ issue on the difficulty of applying trained systems to text genres that are not seen previously during development. We then describe related efforts within the text-mining community, with a special focus on the BioCreative series of challenge workshops. We believe that focusing on the near-term challenges identified in this work will amplify the opportunities afforded by the continued adoption of text-mining tools. In conclusion, in order to sustain the curation ecosystem and have text-mining systems adopted for practical benefits, we call for increased collaboration between text-mining researchers and various stakeholders, including researchers, publishers and biocurators.

  18. What Lies Beneath? An Evaluation of Rapid Assessment Tools for Management of Hull Fouling

    NASA Astrophysics Data System (ADS)

    Clarke Murray, Cathryn; Therriault, Thomas W.; Pakhomov, Evgeny

    2013-08-01

    Despite an increased understanding of marine invasions, non-indigenous species (NIS) continue to be redistributed at both global and regional scales. Since prevention is an important element of NIS programs, monitoring vectors responsible for NIS introductions and spread, such as hull fouling, has become a priority and methods should be selected carefully to balance accuracy, time, and cost. Two common fouling assessment tools for the marine recreational boating vector were evaluated for accuracy using a traditional underwater SCUBA survey in coastal British Columbia: a dockside level of fouling assessment and a behavioral questionnaire model. Results showed that although rapid, dockside assessments did not provide an accurate assessment of fouling present below the surface, at least not in this region. In contrast, a questionnaire-based model using four easily obtained variables (boat type, age of antifouling paint, storage type, and occurrence of long distance trips) reliably identified boats carrying macrofouling species, a proxy for risk of NIS transport. Once validated, this fouling model tool could be applied in border inspection or quarantine situations where decisions must be made quickly. Further development and refinement of rapid assessment tools would improve our ability to prevent new introductions and manage spread of existing invasive species.
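The questionnaire-based model described above scores a small set of easily obtained predictors to flag boats likely to carry macrofouling. A logistic-style scoring sketch makes the idea concrete; the weights, thresholds, and field names below are invented for illustration and are not the fitted coefficients from this study:

```python
# Hypothetical sketch of a questionnaire-based fouling-risk score using the
# four predictors named in the abstract: boat type, antifouling paint age,
# storage type, and occurrence of long-distance trips. Weights are
# illustrative only, not the study's fitted model.
import math

WEIGHTS = {
    "intercept": -2.0,
    "is_sailboat": 0.8,         # boat type
    "paint_age_years": 0.6,     # older antifouling paint -> more fouling
    "stored_in_water": 1.2,     # storage type (in-water vs. dry storage)
    "long_distance_trips": 0.9, # long trips spread propagules farther
}

def fouling_probability(boat):
    """Logistic score: estimated probability the hull carries macrofouling."""
    z = (WEIGHTS["intercept"]
         + WEIGHTS["is_sailboat"] * boat["is_sailboat"]
         + WEIGHTS["paint_age_years"] * boat["paint_age_years"]
         + WEIGHTS["stored_in_water"] * boat["stored_in_water"]
         + WEIGHTS["long_distance_trips"] * boat["long_distance_trips"])
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

high_risk = {"is_sailboat": 1, "paint_age_years": 3, "stored_in_water": 1, "long_distance_trips": 1}
low_risk = {"is_sailboat": 0, "paint_age_years": 0, "stored_in_water": 0, "long_distance_trips": 0}
print(fouling_probability(high_risk), fouling_probability(low_risk))
```

The appeal for border-inspection or quarantine use is that all four inputs can be collected in a brief interview, so a boat can be triaged in minutes rather than requiring a dive survey.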

  20. Can surgical simulation be used to train detection and classification of neural networks?

    PubMed

    Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail

    2017-10-01

    Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools are a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising results to generalise on real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.
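
    The Letter's networks are deep convolutional and adversarial models, but the underlying transfer-learning step (freeze a feature extractor trained on one domain, then fit only a small head on the target domain) can be sketched with toy stand-ins. The frozen random projection and synthetic data below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a feature extractor pre-trained on simulated cataract frames:
# a frozen random projection followed by tanh (purely illustrative).
W_frozen = rng.normal(size=(16, 4))

def features(x):
    return np.tanh(x @ W_frozen)  # frozen: never updated below

# Toy "real surgical" data: two well-separated clusters of 16-dim inputs,
# labelled 1 (tool present) and 0 (tool absent).
X = np.vstack([rng.normal(+1.0, 0.25, size=(40, 16)),
               rng.normal(-1.0, 0.25, size=(40, 16))])
y = np.array([1] * 40 + [0] * 40)

# Transfer-learning step: fit only a logistic head on the frozen features.
F = features(X)
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # predicted probabilities
    w -= 0.5 * F.T @ (p - y) / len(y)       # gradient step on the head only
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(F @ w + b))) > 0.5).astype(int)
accuracy = float(np.mean(pred == y))
```

    The point of the sketch is the division of labour: the expensive representation is learned once on cheap simulated data, while only the small head is adapted to scarce labelled real data.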

  1. Brazed Diamond Micropowder Bur Fabricated by Supersonic Frequency Induction Heating for Precision Machining

    NASA Astrophysics Data System (ADS)

    Ma, Bojiang; Lou, Jianpeng; Pang, Qian

    2014-04-01

    The common brazed diamond micropowder bur fabricated in a vacuum furnace produces an even brazing alloy surface. The small brazed diamond grits show low outcropping from the brazing alloy surface, and the chip space between them is small. The bur shows a low grinding efficiency and poor heat dissipation. In this study, a brazed diamond micropowder bur was fabricated by supersonic frequency induction heating. The method afforded a fluctuant surface on the brazing alloy. The brazed diamond grits, with a high outcropping height, were distributed uniformly on the fluctuant surface, and the fluctuant surface provided adequate chip space. These characteristics of the tool increased the grinding efficiency and decreased the temperature of the grinding arc area. The roughness Ra of the ceramic tile surface trimmed by the tool cylinder was between 0.09 and 0.12 μm. In the first 90 min, the weight loss of the ceramic tile ground by the tool was higher than that ground by the tool fabricated in a vacuum furnace. When the ceramic tile was cylindrically ground, the temperature of the grinding arc area, measured using a thermocouple, remained below 70 °C.

  2. New tuberculosis technologies: challenges for retooling and scale-up.

    PubMed

    Pai, M; Palamountain, K M

    2012-10-01

    The availability of new tools does not mean that they will be adopted, used correctly, scaled up or have public health impact. Experience to date with new diagnostics suggests that many national tuberculosis programmes (NTPs) in high-burden countries are reluctant to adopt and scale up new tools, even when these are backed by evidence and global policy recommendations. We suggest that there are several common barriers to effective national adoption and scale-up of new technologies: global policy recommendations that do not provide sufficient information for scale-up, complex decision-making processes and weak political commitment at the country level, limited engagement of and support to NTP managers, high cost of tools and poor fit with user needs, unregulated markets and inadequate business models, limited capacity for laboratory strengthening and implementation research, and insufficient advocacy and donor support. Overcoming these barriers will require enhanced country-level advocacy, resources, technical assistance and political commitment. Some of the BRICS (Brazil, Russia, India, China, South Africa) countries are emerging as early adopters of policies and technologies, and are increasing their investments in TB control. They may provide the first opportunities to fully assess the public health impact of new tools.

  3. Knowledge translation regarding financial abuse and dementia for the banking sector: the development and testing of an education tool.

    PubMed

    Peisah, Carmelle; Bhatia, Sangita; Macnab, Jenna; Brodaty, Henry

    2016-07-01

    Financial abuse is the most common form of elder abuse. Capacity Australia, established to promote education regarding capacity and abuse prevention across health, legal and financial sectors, was awarded a grant by the Dementia Collaborative Research Centre to educate the banking sector on financial abuse and dementia. We aimed to develop a knowledge translation tool for bank staff on this issue. The banking sector across Australia was engaged and consulted to develop a tailored education tool based on Australian Banking Association's Guidelines on Financial Abuse Prevention, supplemented by information related to dementia, financial capacity and supported decision-making. The tool was tested on 69 banking staff across Australia from two major banks. An online education tool using adaptive learning was developed, comprising a pretest of 15 multiple choice questions, followed by a learning module tailored to the individual's performance on the pretest, and a post-test to assess knowledge translation. A significant increase in scores was demonstrated when baseline scores were compared with post-course scores (mean difference in scores = 3.5; SD = 1.94; t = 15.1; df = 68; p < 0.001). The tool took approximately 10-20 min to complete depending on the knowledge of participant and continuity of completion. The Australian banking industry was amenable to assist in the development of a tailored education tool on dementia, abuse and financial capacity. This online e-tool provides an effective medium for knowledge translation. Copyright © 2015 John Wiley & Sons, Ltd.
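
    The reported statistic can be checked directly from the summary figures in the abstract, since a paired t statistic is the mean of the paired differences divided by its standard error:

```python
import math

mean_diff = 3.5  # mean of post-course minus pre-course scores
sd_diff = 1.94   # standard deviation of the paired differences
n = 69           # df = 68 implies n = 69 participants

# Paired t statistic: mean difference over the standard error of the mean.
t_stat = mean_diff / (sd_diff / math.sqrt(n))
```

    The result, roughly 15.0, agrees with the reported t = 15.1 up to rounding of the published summary statistics.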

  4. Uterine Cancer Statistics

    MedlinePlus

    ... the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations tool makes ...

  5. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.
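
    Of the decision-making tools named here, multiattribute analysis is the simplest to illustrate: each drug is rated on several attributes and the ratings are combined as a weighted sum. The attributes and weights below are hypothetical, chosen only to show the mechanics; the abstract does not specify how the surveyed HMOs parameterized their analyses:

```python
# Hypothetical attribute weights (must sum to 1 for a normalized score).
WEIGHTS = {"efficacy": 0.4, "safety": 0.3, "cost": 0.2, "convenience": 0.1}

def multiattribute_score(ratings):
    """Weighted sum of attribute ratings (each on a 5-point scale)."""
    return sum(WEIGHTS[attr] * ratings[attr] for attr in WEIGHTS)

drug_a = {"efficacy": 5, "safety": 4, "cost": 2, "convenience": 3}
drug_b = {"efficacy": 3, "safety": 5, "cost": 5, "convenience": 4}

score_a = multiattribute_score(drug_a)  # 0.4*5 + 0.3*4 + 0.2*2 + 0.1*3 = 3.9
score_b = multiattribute_score(drug_b)  # 0.4*3 + 0.3*5 + 0.2*5 + 0.1*4 = 4.1
```

    A formulary committee using such a scheme would rank candidate drugs by score; the harder work, as the survey suggests, lies in agreeing on the attributes and weights.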

  6. Critical Appraisal Tools and Reporting Guidelines for Evidence-Based Practice.

    PubMed

    Buccheri, Robin K; Sharifi, Claire

    2017-12-01

    Nurses engaged in evidence-based practice (EBP) have two important sets of tools: Critical appraisal tools and reporting guidelines. Critical appraisal tools facilitate the appraisal process and guide a consumer of evidence through an objective, analytical, evaluation process. Reporting guidelines, checklists of items that should be included in a publication or report, ensure that the project or guidelines are reported with clarity, completeness, and transparency. The primary purpose of this paper is to help nurses understand the difference between critical appraisal tools and reporting guidelines. A secondary purpose is to help nurses locate the appropriate tool for the appraisal or reporting of evidence. A systematic search was conducted to find commonly used critical appraisal tools and reporting guidelines for EBP in nursing. This article serves as a resource to help nurses navigate the often-overwhelming terrain of critical appraisal tools and reporting guidelines, and will help both novice and experienced consumers of evidence more easily select the appropriate tool(s) to use for critical appraisal and reporting of evidence. Having the skills to select the appropriate tool or guideline is an essential part of meeting EBP competencies for both practicing registered nurses and advanced practice nurses (Melnyk & Gallagher-Ford, 2015; Melnyk, Gallagher-Ford, & Fineout-Overholt, 2017). Nine commonly used critical appraisal tools and eight reporting guidelines were found and are described in this manuscript. Specific steps for selecting an appropriate tool as well as examples of each tool's use in a publication are provided. Practicing registered nurses and advanced practice nurses must be able to critically appraise and disseminate evidence in order to meet EBP competencies. This article is a resource for understanding the difference between critical appraisal tools and reporting guidelines, and identifying and accessing appropriate tools or guidelines.
© 2017 Sigma Theta Tau International.

  7. Toward Scholarship in Practice

    ERIC Educational Resources Information Center

    Singer-Gabella, Marcy

    2012-01-01

    Background/Context: Over the past decade, scholars of teaching and teacher education have concluded that the field lacks a common conceptual vocabulary to undergird systematic investigation of practice. Absent a shared language, we can neither articulate common questions nor establish common tools--essential elements for building knowledge and…

  8. Prostate cancer: predicting high-risk prostate cancer-a novel stratification tool.

    PubMed

    Buck, Jessica; Chughtai, Bilal

    2014-05-01

    Currently, numerous systems exist for the identification of high-risk prostate cancer, but few of these systems can guide treatment strategies. A new stratification tool that uses common diagnostic factors can help to predict outcomes after radical prostatectomy. The tool aids physicians in the identification of appropriate candidates for aggressive, local treatment.

  9. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  10. Context and hand posture modulate the neural dynamics of tool-object perception.

    PubMed

    Natraj, Nikhilesh; Poole, Victoria; Mizelle, J C; Flumini, Andrea; Borghi, Anna M; Wheaton, Lewis A

    2013-02-01

    Prior research has linked visual perception of tools with plausible motor strategies. Thus, observing a tool activates the putative action-stream, including the left posterior parietal cortex. Observing a hand functionally grasping a tool involves the inferior frontal cortex. However, tool-use movements are performed in a contextual and grasp specific manner, rather than relative isolation. Our prior behavioral data has demonstrated that the context of tool-use (by pairing the tool with different objects) and varying hand grasp postures of the tool can interact to modulate subjects' reaction times while evaluating tool-object content. Specifically, perceptual judgment was delayed in the evaluation of functional tool-object pairings (Correct context) when the tool was non-functionally (Manipulative) grasped. Here, we hypothesized that this behavioral interference seen with the Manipulative posture would be due to increased and extended left parietofrontal activity possibly underlying motor simulations when resolving action conflict due to this particular grasp at time scales relevant to the behavioral data. Further, we hypothesized that this neural effect will be restricted to the Correct tool-object context wherein action affordances are at a maximum. 64-channel electroencephalography (EEG) was recorded from 16 right-handed subjects while viewing images depicting three classes of tool-object contexts: functionally Correct (e.g. coffee pot-coffee mug), functionally Incorrect (e.g. coffee pot-marker) and Spatial (coffee pot-milk). The Spatial context pairs a tool and object that would not functionally match, but may commonly appear in the same scene. These three contexts were modified by hand interaction: No Hand, Static Hand near the tool, Functional Hand posture and Manipulative Hand posture. The Manipulative posture is convenient for relocating a tool but does not afford a functional engagement of the tool on the target object. 
Subjects were instructed to visually assess whether the pictures displayed correct tool-object associations. EEG data was analyzed in time-voltage and time-frequency domains. Overall, Static Hand, Functional and Manipulative postures cause early activation (100-400ms post image onset) of parietofrontal areas, to varying intensity in each context, when compared to the No Hand control condition. However, when context is Correct, only the Manipulative Posture significantly induces extended neural responses, predominantly over right parietal and right frontal areas [400-600ms post image onset]. Significant power increase was observed in the theta band [4-8Hz] over the right frontal area, [0-500ms]. In addition, when context is Spatial, Manipulative posture alone significantly induces extended neural responses, over bilateral parietofrontal and left motor areas [400-600ms]. Significant power decrease occurred primarily in beta bands [12-16, 20-25Hz] over the aforementioned brain areas [400-600ms]. Here, we demonstrate that the neural processing of tool-object perception is sensitive to several factors. While both Functional and Manipulative postures in Correct context engage predominantly an early left parietofrontal circuit, the Manipulative posture alone extends the neural response and transitions to a late right parietofrontal network. This suggests engagement of a right neural system to evaluate action affordances when hand posture does not support action (Manipulative). Additionally, when tool-use context is ambiguous (Spatial context), there is increased bilateral parietofrontal activation and, extended neural response for the Manipulative posture. These results point to the existence of other networks evaluating tool-object associations when motoric affordances are not readily apparent and underlie corresponding delayed perceptual judgment in our prior behavioral data wherein Manipulative postures had exclusively interfered in judging tool-object content. 
Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Commonly used stimulants: Sleep problems, dependence and psychological distress.

    PubMed

    Ogeil, Rowan P; Phillips, James G

    2015-08-01

    Caffeine and nicotine are commonly used stimulants that enhance alertness and mood. Discontinuation of both stimulants is associated with withdrawal symptoms including sleep and mood disturbances, which may differ in males and females. The present study examines changes in sleep quality, daytime sleepiness and psychological distress associated with use and dependence on caffeine and nicotine. An online survey comprising validated tools to assess sleep quality, excessive daytime sleepiness and psychological distress was completed by 166 participants (74 males, 96 females) with a mean age of 28 years. Participants completed the study in their own time, and were not offered any inducements to participate. Sleep quality was poorer in those dependent upon caffeine or nicotine, and there were also significant interaction effects with gender whereby females reported poorer sleep despite males reporting higher use of both stimulants. Caffeine dependence was associated with poorer sleep quality, increased daytime dysfunction, and increased levels of night time disturbance, while nicotine dependence was associated with poorer sleep quality and increased use of sleep medication and sleep disturbances. There were strong links between poor sleep and diminished affect, with psychological distress found to co-occur in the context of disturbed sleep. Stimulants are widely used to promote vigilance and mood; however, dependence on commonly used drugs including caffeine and nicotine is associated with decrements in sleep quality and increased psychological distress, which may be compounded in female dependent users. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Axial strength test for round flat faced versus capsule shaped bilayer tablets.

    PubMed

    Franck, Jason; Abebe, Admassu; Keluskar, Rekha; Martin, Kyle; Majumdar, Antara; Kottala, Niranjan; Stamato, Howard

    2015-03-01

    There has been increasing interest in fixed dose combination (FDC) therapy. Multi-layer tablets are a popular choice among various technologies to deliver FDCs. In most cases, round flat faced tooling is used in testing tablets as it has the simplest geometry. However, shaped tooling is more common for commercial products and may have an effect on bilayer tablet strength. Capsule shaped bilayer tablets, similar to a commercial image, and holders conforming to the tablet topology, were compared with similar round flat faced bilayer tablets and their corresponding holders. Bilayer tablets were subjected to an axial test until fracture, and the breaking force was recorded. As the second layer compression force increases, regardless of holder design, an increase in breaking force occurs as expected. This consistent trend provides insight regarding the breaking force of capsule shaped bilayer tablets. The results of this study show that at lower second layer compression forces, tablet geometry does not significantly impact the results. However, at higher compression forces, a significant difference in breaking force between tablet geometries exists. Therefore, using a test geometry close to the final commercial tablet image is recommended to obtain the most accurate prediction of tablet breakage.

  13. Reviewing the adoption and impact of water markets in the Murray-Darling Basin, Australia

    NASA Astrophysics Data System (ADS)

    Wheeler, S.; Loch, A.; Zuo, A.; Bjornlund, H.

    2014-10-01

    Water markets have increasingly been adopted as a reallocation tool around the world as water scarcity intensifies. Water markets were first introduced in Australia in the 1980s, and water entitlement and allocation trade have been increasingly adopted by both private individuals and governments. As well as providing an overview of water policy in Australia since the 1900s, this paper examines the adoption of water trading in the southern Murray-Darling Basin of Australia (the largest hydrologically connected water market in Australia), and investigates the associated social, economic and environmental impacts that have arisen from the implementation of water markets. This study found that up to 86% of irrigators in one state in the southern Murray-Darling Basin had undertaken at least one water market trade by 2010-2011; hence, water market strategies are now a common tool employed by irrigators to assist their farm management. A variety of institutional, policy and informational changes are identified to increase the benefits from water markets in the future. There is no doubt that managing the impact of climate change and water scarcity are intertwined, suggesting that policy, institutional and governance responses should be similarly structured and coordinated.

  14. An approach to radiation safety department benchmarking in academic and medical facilities.

    PubMed

    Harvey, Richard P

    2015-02-01

    Based on anecdotal evidence and networking with colleagues at other facilities, it has become evident that some radiation safety departments are not adequately staffed and radiation safety professionals need to increase their staffing levels. Discussions with management regarding radiation safety department staffing often lead to similar conclusions. Management acknowledges the Radiation Safety Officer (RSO) or Director of Radiation Safety's concern but asks the RSO to provide benchmarking and justification for additional full-time equivalents (FTEs). The RSO must determine a method to benchmark and justify additional staffing needs while struggling to maintain a safe and compliant radiation safety program. Benchmarking and justification are extremely important tools that are commonly used to demonstrate the need for increased staffing in other disciplines and are tools that can be used by radiation safety professionals. Parameters that most RSOs would expect to be positive predictors of radiation safety staff size generally are and can be emphasized in benchmarking and justification report summaries. Facilities with large radiation safety departments tend to have large numbers of authorized users, be broad-scope programs, be subject to increased controls regulations, have large clinical operations, have significant numbers of academic radiation-producing machines, and have laser safety responsibilities.

  15. Modelling the occurrence of heat waves in maximum and minimum temperatures over Spain and projections for the period 2031-60

    NASA Astrophysics Data System (ADS)

    Abaurrea, J.; Asín, J.; Cebrián, A. C.

    2018-02-01

    The occurrence of extreme heat events in maximum and minimum daily temperatures is modelled using a non-homogeneous common Poisson shock process. It is applied to five Spanish locations, representative of the most common climates over the Iberian Peninsula. The model is based on an excess over threshold approach and distinguishes three types of extreme events: only in maximum temperature, only in minimum temperature and in both of them (simultaneous events). It takes into account the dependence between the occurrence of extreme events in both temperatures and its parameters are expressed as functions of time and temperature related covariates. The fitted models allow us to characterize the occurrence of extreme heat events and to compare their evolution in the different climates during the observed period. This model is also a useful tool for obtaining local projections of the occurrence rate of extreme heat events under climate change conditions, using the future downscaled temperature trajectories generated by Earth System Models. The projections for 2031-60 under scenarios RCP4.5, RCP6.0 and RCP8.5 are obtained and analysed using the trajectories from four earth system models which have successfully passed a preliminary control analysis. Different graphical tools and summary measures of the projected daily intensities are used to quantify the climate change on a local scale. A high increase in the occurrence of extreme heat events, mainly in July and August, is projected in all the locations, all types of event and in the three scenarios, although in 2051-60 the increase is higher under RCP8.5. However, relevant differences are found between the evolution in the different climates and the types of event, with a specially high increase in the simultaneous ones.
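
    A non-homogeneous Poisson process of this kind is commonly simulated by thinning (the Lewis-Shedler algorithm): generate candidate events at a constant rate that bounds the intensity, then keep each candidate with probability intensity(t) divided by the bound. A minimal sketch with a purely illustrative seasonal intensity, not the paper's fitted covariate-dependent model:

```python
import math
import random

def simulate_nhpp(rate, t_max, rate_max, rng):
    """Simulate event times of a non-homogeneous Poisson process on [0, t_max]
    by thinning: candidates arrive at constant rate rate_max and each is
    kept with probability rate(t) / rate_max."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)  # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < rate(t) / rate_max:
            events.append(t)

# Toy seasonal intensity (events per day), peaking mid "summer".
rate = lambda t: 2.0 + 1.5 * math.sin(math.pi * t / 90.0)
events = simulate_nhpp(rate, t_max=90.0, rate_max=3.5, rng=random.Random(42))
```

    Here rate_max = 3.5 is a valid bound because the toy intensity never exceeds 2.0 + 1.5; the same scheme applies with temperature-covariate intensities, provided a bound can be supplied.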

  16. Evaluating the Effect of a Web-Based E-Learning Tool for Health Professional Education on Clinical Vancomycin Use: Comparative Study.

    PubMed

    Bond, Stuart Evan; Crowther, Shelley P; Adhikari, Suman; Chubaty, Adriana J; Yu, Ping; Borchard, Jay P; Boutlis, Craig Steven; Yeo, Wilfred Winston; Miyakis, Spiros

    2018-02-26

    Internet-based learning for health professional education is increasing. It offers advantages over traditional learning approaches, as it enables learning to be completed at a time convenient to the user and improves access where facilities are geographically disparate. We developed and implemented the Vancomycin Interactive (VI) e-learning tool to improve knowledge on the clinical use of the antibiotic vancomycin, which is commonly used for treatment of infections caused by methicillin-resistant Staphylococcus aureus (MRSA). The aims of this study were to evaluate the effect of the VI e-learning tool on (1) survey knowledge scores and (2) clinical use of vancomycin among health professionals. We conducted a comparative pre-post intervention study across the 14 hospitals of two health districts in New South Wales, Australia. A knowledge survey was completed by nurses, doctors, and pharmacists before and after release of a Web-based e-learning tool. Survey scores were compared with those obtained following traditional education in the form of an email intervention. Survey questions related to dosing, administration, and monitoring of vancomycin. Outcome measures were survey knowledge scores among the three health professional groups, vancomycin plasma trough levels, and vancomycin approvals recorded on a computerized clinical decision support system. Survey response rates were low at 26.87% (577/2147) preintervention and 8.24% (177/2147) postintervention. The VI was associated with an increase in knowledge scores (maximum score=5) among nurses (median 2, IQR 1-2 to median 2, IQR 1-3; P<.001), but not among other professional groups. The comparator email intervention was associated with an increase in knowledge scores among doctors (median 3, IQR 2-4 to median 4, IQR 2-4; P=.04). Participants who referred to Web-based resources while completing the e-learning tool achieved higher overall scores than those who did not (P<.001). 
The e-learning tool was not shown to be significantly more effective than the comparator email in the clinical use of vancomycin, as measured by plasma levels within the therapeutic range. The e-learning tool was associated with improved knowledge scores among nurses, whereas the comparator email was associated with improved scores among doctors. This implies that different strategies may be required for optimizing the effectiveness of education among different health professional groups. Low survey response rates limited conclusions regarding the tool's effectiveness. Improvements to design and evaluation methodology may increase the likelihood of a demonstrable effect from e-learning tools in the future. ©Stuart Evan Bond, Shelley P Crowther, Suman Adhikari, Adriana J Chubaty, Ping Yu, Jay P Borchard, Craig Steven Boutlis, Wilfred Winston Yeo, Spiros Miyakis. Originally published in JMIR Medical Education (http://mededu.jmir.org), 26.02.2018.

  17. HiRel - Reliability/availability integrated workstation tool

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Dugan, Joanne B.

    1992-01-01

    The HiRel software tool is described and demonstrated by application to the mission avionics subsystem of the Advanced System Integration Demonstrations (ASID) system that utilizes the PAVE PILLAR approach. HiRel marks another accomplishment toward the goal of producing a totally integrated computer-aided design (CAD) workstation design capability. Since a reliability engineer generally represents a reliability model graphically before it can be solved, the use of a graphical input description language increases productivity and decreases the incidence of error. The graphical postprocessor module HARPO makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes. The addition of several powerful HARP modeling engines provides the user with a reliability/availability modeling capability for a wide range of system applications all integrated under a common interactive graphical input-output capability.

  18. Detecting Spatial Patterns in Biological Array Experiments

    PubMed Central

    ROOT, DAVID E.; KELLEY, BRIAN P.; STOCKWELL, BRENT R.

    2005-01-01

    Chemical genetic screening and DNA and protein microarrays are among a number of increasingly important and widely used biological research tools that involve large numbers of parallel experiments arranged in a spatial array. It is often difficult to ensure that uniform experimental conditions are present throughout the entire array, and as a result, one often observes systematic spatially correlated errors, especially when array experiments are performed using robots. Here, the authors apply techniques based on the discrete Fourier transform to identify and quantify spatially correlated errors superimposed on a spatially random background. They demonstrate that these techniques are effective in identifying common spatially systematic errors in high-throughput 384-well microplate assay data. In addition, the authors employ a statistical test to allow for automatic detection of such errors. Software tools for using this approach are provided. PMID:14567791
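
    The core idea can be reproduced on a synthetic 384-well plate (16 rows by 24 columns): a spatially periodic systematic error appears as a sharp off-DC peak in the 2-D discrete Fourier transform. The sketch below uses a noise-free artifact for clarity; the authors' statistical test additionally handles artifacts superimposed on a random background:

```python
import numpy as np

ROWS, COLS = 16, 24  # a 384-well plate layout

# Synthetic plate readings: a flat signal plus a systematic error that
# cycles every four rows (e.g. a periodic dispensing artifact).
rows = np.arange(ROWS)[:, None]
plate = 1.0 + 0.2 * np.cos(2 * np.pi * rows * 4 / ROWS) * np.ones((ROWS, COLS))

# 2-D discrete Fourier transform; zero out the DC term (the plate mean).
spectrum = np.abs(np.fft.fft2(plate))
spectrum[0, 0] = 0.0

# The dominant remaining peak sits at the artifact's spatial frequency
# (row frequency 4 or its conjugate, column frequency 0).
peak_row, peak_col = np.unravel_index(np.argmax(spectrum), spectrum.shape)
```

    In practice one would compare such peak magnitudes against the spectrum of a spatially random background to decide automatically whether a systematic error is present.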

  19. Designing and encoding models for synthetic biology.

    PubMed

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-08-06

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.
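    As a toy illustration of the design-simulate-validate loop described here (a deliberately minimal model with made-up parameters, not one from the paper), a two-variable gene-expression network can be encoded and simulated in a few lines:

```python
# Minimal dynamical model of gene expression, the kind of system an
# SBML-style description would encode (all parameters are illustrative):
#   dm/dt = k_tx - d_m * m       (mRNA: constitutive transcription, decay)
#   dp/dt = k_tl * m - d_p * p   (protein: translation, decay)
k_tx, d_m, k_tl, d_p = 2.0, 0.2, 5.0, 0.05

def simulate(t_end=200.0, dt=0.01):
    """Forward-Euler integration from an empty initial state."""
    m = p = 0.0
    for _ in range(int(t_end / dt)):
        m += dt * (k_tx - d_m * m)
        p += dt * (k_tl * m - d_p * p)
    return m, p

m_ss, p_ss = simulate()
# Validation step: the simulated trajectory should approach the
# analytic steady state m* = k_tx/d_m = 10 and p* = k_tl*m*/d_p = 1000.
print(m_ss, p_ss)
```

    Checking simulation output against a known analytic result, as in the last step, is one small instance of the model-validation stage that closes the systems biology "loop" mentioned above.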

  20. Investigation of the influence of process parameters on adhesive wear under hot stamping conditions

    NASA Astrophysics Data System (ADS)

    Schwingenschlögl, P.; Weldi, M.; Merklein, M.

    2017-09-01

    Current challenges such as increasing safety standards and reducing fuel consumption motivate lightweight construction in modern car bodies. Besides the use of lightweight workpiece materials like aluminum, hot stamping has been established as a key technology for producing safety-relevant components. Producing hot stamped parts out of ultra-high-strength steels offers the possibility of improving crash performance. At the same time, the weight of the car structure is reduced by using thinner sheet thicknesses. In order to avoid oxide scale formation and ensure corrosion protection, AlSi coatings are commonly deposited on the sheet surfaces used for direct hot stamping. This workpiece coating has a critical impact on the tribological conditions within the forming process and, as a consequence, influences the quality of hot stamped parts as well as tool wear. AlSi coatings have been identified as a major cause of adhesive wear, which represents the main wear mechanism in hot stamping. Within this study, the influence of the process parameters on adhesive wear is investigated as a function of workpiece and tool temperatures, drawing velocities, and contact pressures. The tribological behavior is analyzed based on strip drawing experiments under direct hot stamping conditions. The experiments are performed with AlSi-coated 22MnB5 in contact with the hot-working tool steel 1.2367. To analyze the amount of adhesion on the friction jaws, the surfaces are characterized by optical measurements. The experiments indicate that higher workpiece temperatures cause severe adhesive wear on the tool surface, while an increase in drawing velocity or contact pressure leads to reduced adhesion. The measured friction coefficients decreased with a rising amount of adhesion and remained at a constant level after a certain adhesive layer had built up on the tool surface.

  1. CMR Metadata Curation

    NASA Technical Reports Server (NTRS)

    Shum, Dana; Bugbee, Kaylin

    2017-01-01

    This talk explains the ongoing metadata curation activities in the Common Metadata Repository. It explores tools that exist today which are useful for building quality metadata and also opens up the floor for discussions on other potentially useful tools.

  2. SNP discovery in common bean by restriction-associated DNA (RAD) sequencing for genetic diversity and population structure analysis.

    PubMed

    Valdisser, Paula Arielle M R; Pappas, Georgios J; de Menezes, Ivandilson P P; Müller, Bárbara S F; Pereira, Wendell J; Narciso, Marcelo G; Brondani, Claudio; Souza, Thiago L P O; Borba, Tereza C O; Vianello, Rosana P

    2016-06-01

    Researchers have made great advances in the development and application of genomic approaches for common bean, creating opportunities to drive more practical and applicable strategies for the sustainable management of these genetic resources in plant breeding. This work provides useful polymorphic single-nucleotide polymorphisms (SNPs) for high-throughput common bean genotyping developed by RAD (restriction site-associated DNA) sequencing. The RAD tags were generated from DNA pooled from 12 common bean genotypes, including breeding lines of different gene pools and market classes. The aligned sequences identified 23,748 putative RAD-SNPs, of which 3357 were adequate for genotyping; 1032 RAD-SNPs with the highest ADT (assay design tool) score are presented in this article. The RAD-SNPs were structurally annotated in different coding (47.00 %) and non-coding (53.00 %) sequence components of genes. A subset of 384 RAD-SNPs with broad genome distribution was used to genotype a diverse panel of 95 common bean germplasms and revealed a successful amplification rate of 96.6 %, showing 73 % polymorphic SNPs within the Andean group and 83 % in the Mesoamerican group. A slightly increased He (0.161, n = 21) value was estimated for the Andean gene pool, compared to the Mesoamerican group (0.156, n = 74). For the linkage disequilibrium (LD) analysis, from a group of 580 SNPs (289 RAD-SNPs and 291 BARC-SNPs) genotyped for the same set of genotypes, 70.2 % were in LD, decreasing to 0.10 % in the Andean group and 0.77 % in the Mesoamerican group. Haplotype patterns spanning 310 Mb of the genome (60 %) were characterized in samples from different origins. However, the haplotype frameworks were under-represented for the Andean (7.85 %) and Mesoamerican (5.55 %) gene pools separately. In conclusion, RAD sequencing allowed the discovery of hundreds of useful SNPs for broad genetic analysis of common bean germplasm. Going forward, this approach provides an excellent panel of molecular tools for whole-genome analysis, allowing common bean breeding practices to be better integrated and explored.
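    The gene-diversity values (He) quoted in this abstract are averages over loci of per-SNP expected heterozygosity, He = 1 − Σ p_i². A minimal sketch with illustrative allele counts (not data from the study):

```python
def expected_heterozygosity(allele_counts):
    """Gene diversity He = 1 - sum(p_i^2), from per-allele counts at one locus."""
    total = sum(allele_counts)
    return 1.0 - sum((c / total) ** 2 for c in allele_counts)

# e.g. a biallelic SNP whose minor allele occurs 9 times in 100 chromosomes
he = expected_heterozygosity([91, 9])
print(round(he, 4))  # → 0.1638, i.e. 1 - (0.91^2 + 0.09^2)
```

    Averaging this quantity across the genotyped SNP panel, separately within each gene pool, yields summary diversity figures of the kind reported above.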

  3. Targeting obesity-related adipose tissue dysfunction to prevent cancer development and progression

    PubMed Central

    Gucalp, Ayca; Iyengar, Neil M.; Hudis, Clifford A.; Dannenberg, Andrew J.

    2016-01-01

    The incidence of obesity, a leading modifiable risk factor for common solid tumors, is increasing. Effective interventions are needed to minimize the public health implications of obesity. Although the mechanisms linking increased adiposity to malignancy are incompletely understood, growing evidence points to complex interactions among multiple systemic and tissue-specific pathways including inflamed white adipose tissue. The metabolic and inflammatory consequences of white adipose tissue dysfunction collectively provide a plausible explanation for the link between overweight/obesity and carcinogenesis. Gaining a better understanding of these underlying molecular pathways and developing risk assessment tools that identify at-risk populations will be critical in implementing effective and novel cancer prevention and management strategies. PMID:26970134

  4. “Epidemiological Criminology”: Coming Full Circle

    PubMed Central

    Lanier, Mark M.

    2009-01-01

    Members of the public health and criminal justice disciplines often work with marginalized populations: people at high risk of drug use, health problems, incarceration, and other difficulties. As these fields increasingly overlap, distinctions between them are blurred, as numerous research reports and funding trends document. However, explicit theoretical and methodological linkages between the 2 disciplines remain rare. A new paradigm that links methods and statistical models of public health with those of their criminal justice counterparts is needed, as are increased linkages between epidemiological analogies, theories, and models and the corresponding tools of criminology. We outline disciplinary commonalities and distinctions, present policy examples that integrate similarities, and propose “epidemiological criminology” as a bridging framework. PMID:19150901

  5. Schizophrenia and Depression Co-Morbidity: What We have Learned from Animal Models

    PubMed Central

    Samsom, James N.; Wong, Albert H. C.

    2015-01-01

    Patients with schizophrenia are at an increased risk for the development of depression. Overlap in the symptoms and genetic risk factors between the two disorders suggests a common etiological mechanism may underlie the presentation of comorbid depression in schizophrenia. Understanding these shared mechanisms will be important in informing the development of new treatments. Rodent models are powerful tools for understanding gene function as it relates to behavior. Examining rodent models relevant to both schizophrenia and depression reveals a number of common mechanisms. Current models which demonstrate endophenotypes of both schizophrenia and depression are reviewed here, including models of CUB and SUSHI multiple domains 1, PDZ and LIM domain 5, glutamate Delta 1 receptor, diabetic db/db mice, neuropeptide Y, disrupted in schizophrenia 1, and its interacting partners, reelin, maternal immune activation, and social isolation. Neurotransmission, brain connectivity, the immune system, the environment, and metabolism emerge as potential common mechanisms linking these models and potentially explaining comorbid depression in schizophrenia. PMID:25762938

  6. Ethics of Social Media Research: Common Concerns and Practical Considerations

    PubMed Central

    Moreno, Megan A.; Goniu, Natalie; Moreno, Peter S.; Diekema, Douglas

    2013-01-01

    Social media Websites (SMWs) are increasingly popular research tools. These sites provide new opportunities for researchers, but raise new challenges for Institutional Review Boards (IRBs) that review these research protocols. As of yet, there is little-to-no guidance regarding how an IRB should review the studies involving SMWs. The purpose of this article was to review the common risks inherent in social media research and consider how researchers can consider these risks when writing research protocols. We focused this article on three common research approaches: observational research, interactive research, and survey/interview research. Concomitant with these research approaches, we gave particular attention to the issues pertinent to SMW research, including privacy, consent, and confidentiality. After considering these challenges, we outlined key considerations for both researchers and reviewers when creating or reviewing SMW IRB protocols. Our goal in this article was to provide a detailed examination of relevant ethics and regulatory issues for both researchers and those who review their protocols. PMID:23679571

  7. Ethics of social media research: common concerns and practical considerations.

    PubMed

    Moreno, Megan A; Goniu, Natalie; Moreno, Peter S; Diekema, Douglas

    2013-09-01

    Social media Websites (SMWs) are increasingly popular research tools. These sites provide new opportunities for researchers, but raise new challenges for Institutional Review Boards (IRBs) that review these research protocols. As of yet, there is little-to-no guidance regarding how an IRB should review the studies involving SMWs. The purpose of this article was to review the common risks inherent in social media research and consider how researchers can consider these risks when writing research protocols. We focused this article on three common research approaches: observational research, interactive research, and survey/interview research. Concomitant with these research approaches, we gave particular attention to the issues pertinent to SMW research, including privacy, consent, and confidentiality. After considering these challenges, we outlined key considerations for both researchers and reviewers when creating or reviewing SMW IRB protocols. Our goal in this article was to provide a detailed examination of relevant ethics and regulatory issues for both researchers and those who review their protocols.

  8. [New developments in spastic unilateral cerebral palsy].

    PubMed

    Chabrier, S; Roubertie, A; Allard, D; Bonhomme, C; Gautheron, V

    2010-01-01

    Hemiplegic (or spastic unilateral) cerebral palsy accounts for about 30% of all cases of cerebral palsy. With a population prevalence of 0.6 per 1000 live births, it is the most common type of cerebral palsy among term-born children and the second most common type after diplegia among preterm infants. Many types of prenatal and perinatal brain injury can lead to congenital hemiplegia, and brain MRI is the most useful tool to classify them accurately and to provide early prognostic information. Perinatal arterial ischemic stroke thus appears as the leading cause in term infants, whereas encephalopathy of prematurity is the most common cause in premature babies. Other causes include brain malformations, neonatal sinovenous thrombosis, parenchymal hemorrhage (for example due to coagulopathy or alloimmune thrombocytopenia) and the more recently described familial forms of porencephaly associated with mutations in the COL4A1 gene. In conjunction with pharmacologic treatment (botulinum neurotoxin injection), new evidence-based rehabilitation interventions, such as constraint-induced movement therapy and mirror therapy, are increasingly being used.

  9. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  10. Occupational Safety. Hand Tools. Pre-Apprenticeship Phase 1 Training.

    ERIC Educational Resources Information Center

    Lane Community Coll., Eugene, OR.

    This self-paced student training module on safety when using hand tools is one of a number of modules developed for Pre-apprenticeship Phase 1 Training. Purpose of the module is to teach students the correct safety techniques for operating common hand- and arm-powered tools, including selection, maintenance, technique, and uses. The module may…

  11. ICE: An Automated Tool for Teaching Advanced C Programming

    ERIC Educational Resources Information Center

    Gonzalez, Ruben

    2017-01-01

    There are many difficulties with learning and teaching programming that can be alleviated with the use of software tools. Most of these tools have focused on the teaching of introductory programming concepts where commonly code fragments or small user programs are run in a sandbox or virtual machine, often in the cloud. These do not permit user…

  12. FFI: What it is and what it can do for you

    Treesearch

    Duncan C. Lutes; MaryBeth Keifer; Nathan C. Benson; John F. Caratti

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool (FEAT). FFI provides...

  13. Indoor Air Quality Problem Solving Tool

    EPA Pesticide Factsheets

    Use the IAQ Problem Solving Tool to learn about the connection between health complaints and common solutions in schools. This resource provides an easy, step-by-step process to start identifying and resolving IAQ problems found at your school.

  14. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  15. [Plagiarism in medical schools, and its prevention].

    PubMed

    Annane, Djillali; Annane, Frédérique

    2012-09-01

    Plagiarism has become very common in universities and medical schools. Undoubtedly, easy access to a huge number of electronic documents is one explanation for the increasing prevalence of plagiarism among students. While most universities and medical schools have clear statements and rules about plagiarism, available tools for detecting plagiarism remain inefficient, and dedicated training programs for students and teachers remain too scarce. As lack of time is one reason students resort to plagiarism, it should be a main target of educational programs. Copyright © 2012. Published by Elsevier Masson SAS.

  16. Fly-by-Wireless Update

    NASA Technical Reports Server (NTRS)

    Studor, George

    2010-01-01

    The presentation reviews what is meant by the term 'fly-by-wireless', common problems and motivation, provides recent examples, and examines NASA's future and basis for collaboration. The vision is to minimize cables and connectors and increase functionality across the aerospace industry by providing reliable, lower cost, modular, and higher performance alternatives to wired data connectivity to benefit the entire vehicle/program life-cycle. Focus areas are system engineering and integration methods to reduce cables and connectors, vehicle provisions for modularity and accessibility, and a 'tool box' of alternatives to wired connectivity.

  17. Applying business intelligence innovations to emergency management.

    PubMed

    Schlegelmilch, Jeffrey; Albanese, Joseph

    2014-01-01

    The use of business intelligence (BI) is common among corporations in the private sector to improve business decision making and create insights for competitive advantage. Increasingly, emergency management agencies are using tools and processes similar to BI systems. With a more thorough understanding of the principles of BI and its supporting technologies, and a careful comparison to the business model of emergency management, this paper seeks to provide insights into how lessons from the private sector can contribute to the development of effective and efficient emergency management BI utilisation.

  18. High Speed Metal Removal

    DTIC Science & Technology

    1982-10-01

    Four steels commonly used in large-caliber projectile manufacture (AISI 1340, 4140, 4340, and HF-1) were machined at high speed. The surviving data tables cover tool-load data for AISI 1340 "finishing" cuts and tool wear-land chart data for AISI 4140 "roughing" cuts with ceramic-coated carbide and ceramic tools.

  19. The application of systems thinking in health: why use systems thinking?

    PubMed

    Peters, David H

    2014-08-26

    This paper explores the question of what systems thinking adds to the field of global health. Observing that elements of systems thinking are already common in public health research, the article discusses which of the large body of theories, methods, and tools associated with systems thinking are more useful. The paper reviews the origins of systems thinking, describing a range of the theories, methods, and tools. A common thread is the idea that the behavior of systems is governed by common principles that can be discovered and expressed. They each address problems of complexity, which is a frequent challenge in global health. The different methods and tools are suited to different types of inquiry and involve both qualitative and quantitative techniques. The paper concludes by emphasizing that explicit models used in systems thinking provide new opportunities to understand and continuously test and revise our understanding of the nature of things, including how to intervene to improve people's health.

  20. Risk determination after an acute myocardial infarction: review of 3 clinical risk prediction tools.

    PubMed

    Scruth, Elizabeth Ann; Page, Karen; Cheng, Eugene; Campbell, Michelle; Worrall-Carter, Linda

    2012-01-01

    The objective of the study was to provide comprehensive information for the clinical nurse specialist (CNS) on commonly used clinical prediction (risk assessment) tools used to estimate risk of a secondary cardiac or noncardiac event and mortality in patients undergoing primary percutaneous coronary intervention (PCI) for ST-elevation myocardial infarction (STEMI). The evolution and widespread adoption of primary PCI represent major advances in the treatment of acute myocardial infarction, specifically STEMI. The American College of Cardiology and the American Heart Association have recommended early risk stratification for patients presenting with acute coronary syndromes using several clinical risk scores to identify patients' mortality and secondary event risk after PCI. Clinical nurse specialists are integral to any performance improvement strategy. Their knowledge and understanding of clinical prediction tools will be essential in carrying out important assessment, identifying and managing risk in patients who have sustained a STEMI, and enhancing discharge education including counseling on medications and lifestyle changes. Over the past 2 decades, risk scores have been developed from clinical trials to facilitate risk assessment. There are several risk scores that can be used to determine in-hospital and short-term survival. This article critiques the most common tools: the Thrombolysis in Myocardial Infarction risk score, the Global Registry of Acute Coronary Events risk score, and the Controlled Abciximab and Device Investigation to Lower Late Angioplasty Complications risk score. The importance of incorporating risk screening assessment tools (that are important for clinical prediction models) to guide therapeutic management of patients cannot be overstated. 
The ability to forecast secondary risk after a STEMI will assist in determining which patients would require the most aggressive level of treatment and monitoring postintervention including outpatient monitoring. With an increased awareness of specialist assessment tools, the CNS can play an important role in risk prevention and ongoing cardiovascular health promotion in patients diagnosed with STEMI. Knowledge of clinical prediction tools to estimate risk for mortality and risk of secondary events after PCI for acute coronary syndromes including STEMI is essential for the CNS in assisting with improving short- and long-term outcomes and for performance improvement strategies. The risk score assessment utilizing a collaborative approach with the multidisciplinary healthcare team provides for the development of a treatment plan including any invasive intervention strategy for the patient. Copyright © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins

  1. Tools for language: patterned iconicity in sign language nouns and verbs.

    PubMed

    Padden, Carol; Hwang, So-One; Lepic, Ryan; Seegers, Sharon

    2015-01-01

    When naming certain hand-held, man-made tools, American Sign Language (ASL) signers exhibit either of two iconic strategies: a handling strategy, where the hands show holding or grasping an imagined object in action, or an instrument strategy, where the hands represent the shape or a dimension of the object in a typical action. The same strategies are also observed in the gestures of hearing nonsigners identifying pictures of the same set of tools. In this paper, we compare spontaneously created gestures from hearing nonsigning participants to commonly used lexical signs in ASL. Signers and gesturers were asked to respond to pictures of tools and to video vignettes of actions involving the same tools. Nonsigning gesturers overwhelmingly prefer the handling strategy for both the Picture and Video conditions. Nevertheless, they use more instrument forms when identifying tools in pictures, and more handling forms when identifying actions with tools. We found that ASL signers generally favor the instrument strategy when naming tools, but when describing tools being used by an actor, they are significantly more likely to use more handling forms. The finding that both gesturers and signers are more likely to alternate strategies when the stimuli are pictures or video suggests a common cognitive basis for differentiating objects from actions. Furthermore, the presence of a systematic handling/instrument iconic pattern in a sign language demonstrates that a conventionalized sign language exploits the distinction for grammatical purpose, to distinguish nouns and verbs related to tool use. Copyright © 2014 Cognitive Science Society, Inc.

  2. Management of Depression in Older Adults: A Review.

    PubMed

    Kok, Rob M; Reynolds, Charles F

    2017-05-23

    Depression in older adults is a common psychiatric disorder affecting their health-related quality of life. Major depression occurs in 2% of adults aged 55 years or older, and its prevalence rises with increasing age. In addition, 10% to 15% of older adults have clinically significant depressive symptoms, even in the absence of major depression. Depression presents with the same symptoms in older adults as it does in younger populations. In contrast to younger patients, older adults with depression more commonly have several concurrent medical disorders and cognitive impairment. Depression occurring in older patients is often undetected or inadequately treated. Antidepressants are the best-studied treatment option, but psychotherapy, exercise therapy, and electroconvulsive therapy may also be effective. Psychotherapy is recommended for patients with mild to moderate severity depression. Many older patients need the same doses of antidepressant medication that are used for younger adult patients. Although antidepressants may effectively treat depression in older adults, they tend to pose greater risk for adverse events because of multiple medical comorbidities and drug-drug interactions in case of polypharmacy. High-quality evidence does not support the use of pharmacologic treatment of depression in patients with dementia. Polypharmacy in older patients can be minimized by using the Screening Tool of Older Persons Prescriptions and Screening Tool to Alert doctors to Right Treatment (STOPP/START) criteria, a valid and reliable screening tool that enables physicians to avoid potentially inappropriate medications, undertreatment, or errors of omissions in older people. Antidepressants can be gradually tapered over a period of several weeks, but discontinuation of antidepressants may be associated with relapse or recurrence of depression, so the patient should be closely observed. 
Major depression in older adults is common and can be effectively treated with antidepressants and electroconvulsive therapy. Psychological therapies and exercise may also be effective for mild-moderate depression, for patients who prefer nonpharmacological treatment, or for patients who are too frail for drug treatments.

  3. Monitoring Evolution at CERN

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.

    2015-12-01

    Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement, new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB disk servers, 100 PB tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers in how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.

  4. Social media in public health.

    PubMed

    Kass-Hout, Taha A; Alhinnawi, Hend

    2013-01-01

    Although social media interactions are not yet fully understood, as individual health behaviors and outcomes are shared online, social media offers an increasingly clear picture of the dynamics of these processes. Social media is becoming an increasingly common platform among clinicians and public health officials for sharing information with the public and for tracking or predicting diseases. Social media can be used for engaging the public and communicating key public health interventions, while providing an important tool for public health surveillance. Social media has advantages over traditional public health surveillance, as well as limitations, such as poor specificity, that warrant additional study. Social media can provide timely, relevant and transparent information of public health importance, such as tracking or predicting the spread or severity of influenza, West Nile virus or meningitis as they propagate in the community, and identifying disease outbreaks or clusters of chronic illnesses. Further work is needed to establish social media as a valid data source for detecting or predicting diseases or conditions, and to determine whether it is an effective tool for communicating key public health messages and engaging both the general public and policy-makers.

  5. A social science data-fusion tool and the Data Management through e-Social Science (DAMES) infrastructure.

    PubMed

    Warner, Guy C; Blum, Jesse M; Jones, Simon B; Lambert, Paul S; Turner, Kenneth J; Tan, Larry; Dawson, Alison S F; Bell, David N F

    2010-08-28

    The last two decades have seen substantially increased potential for quantitative social science research. This has been made possible by the significant expansion of publicly available social science datasets, the development of new analytical methodologies, such as microsimulation, and increases in computing power. These rich resources do, however, bring with them substantial challenges associated with organizing and using data. These processes are often referred to as 'data management'. The Data Management through e-Social Science (DAMES) project is working to support activities of data management for social science research. This paper describes the DAMES infrastructure, focusing on the data-fusion process that is central to the project approach. It covers: the background and requirements for provision of resources by DAMES; the use of grid technologies to provide easy-to-use tools and user front-ends for several common social science data-management tasks such as data fusion; the approach taken to solve problems related to data resources and metadata relevant to social science applications; and the implementation of the architecture that has been designed to achieve this infrastructure.

  6. Finite-element-based matching of pre- and intraoperative data for image-guided endovascular aneurysm repair

    PubMed Central

    Dumenil, Aurélien; Kaladji, Adrien; Castro, Miguel; Esneault, Simon; Lucas, Antoine; Rochette, Michel; Goksu, Cemil; Haigron, Pascal

    2013-01-01

    Endovascular repair of abdominal aortic aneurysms is a well-established technique throughout the medical and surgical communities. Although increasingly indicated, this technique does have some limitations. Because intervention is commonly performed under fluoroscopic control, two-dimensional (2D) visualization of the aneurysm requires the injection of a contrast agent. The projective nature of this imaging modality inevitably leads to topographic errors, and does not give information on arterial wall quality at the time of deployment. A specially adapted intraoperative navigation interface could increase deployment accuracy and reveal such information, which preoperative three-dimensional (3D) imaging might otherwise provide. One difficulty is the precise matching of preoperative data (images and models) with intraoperative observations affected by anatomical deformations due to tool-tissue interactions. Our proposed solution involves a finite-element-based preoperative simulation of tool-tissue interactions, its adaptive tuning to patient-specific data, and matching with intraoperative data. The biomechanical model was first tuned on a group of 10 patients and assessed on a second group of 8 patients. PMID:23269745
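    The matching problem described above is, at its core, a registration problem. The paper's approach is finite-element-based and deformable; as a much simpler point of reference, a closed-form least-squares rigid registration in 2D (assuming known point correspondences, which the real problem does not have) can be sketched as follows.

```python
from math import atan2, cos, sin

def rigid_register_2d(src, dst):
    """Least-squares rigid fit (rotation theta + translation) mapping
    2-D points src onto dst, assuming known correspondences."""
    n = len(src)
    cxs = sum(p[0] for p in src) / n; cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n; cyd = sum(p[1] for p in dst) / n
    # Dot and cross sums of centred coordinates determine the rotation.
    dot = sum((p[0]-cxs)*(q[0]-cxd) + (p[1]-cys)*(q[1]-cyd)
              for p, q in zip(src, dst))
    cross = sum((p[0]-cxs)*(q[1]-cyd) - (p[1]-cys)*(q[0]-cxd)
                for p, q in zip(src, dst))
    theta = atan2(cross, dot)
    tx = cxd - (cxs*cos(theta) - cys*sin(theta))
    ty = cyd - (cxs*sin(theta) + cys*cos(theta))
    return theta, tx, ty

# Recover a known transform from three marker points.
th_true = 0.3
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(x*cos(th_true) - y*sin(th_true) + 1.0,
        x*sin(th_true) + y*cos(th_true) + 2.0) for x, y in src]
theta, tx, ty = rigid_register_2d(src, dst)
```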

  7. Simulation in Nursing Education: iPod As a Teaching Tool for Undergraduate Nurses.

    PubMed

    Evans, Jennifer; Webster, Sue; Gallagher, Susan; Brown, Peter; Sinclair, John

    2015-07-01

    Most people with psychosis and schizophrenia experience auditory hallucinations, particularly the hearing of voices. A common cause of frustration and alienation for consumers is the lack of understanding by therapists, family members and caregivers, who find it difficult to relate to the consumers' experiences. The purpose of this study is to examine and evaluate whether students' participation in a simulated auditory hallucination will increase their understanding and knowledge of psychosis and auditory hallucinations. The design method consisted of a lecture on psychosis and schizophrenia disorders, followed by a simulation of auditory hallucinations using iPods. Students' knowledge and perceptions of psychosis and hallucinations were assessed using quasi-experimental pre-post matched-design questionnaires. The questionnaire was divided into two parts: the first comprised closed questions to assess students' knowledge, and the second consisted of open-ended questions to collect information about students' perceptions of auditory hallucinations. The results confirmed that students' knowledge of psychosis and hallucinations increased following the teaching session, and that simulation is a useful tool for preparing students for clinical placements in mental health practice.

  8. Data Publication and Interoperability for Long Tail Researchers via the Open Data Repository's (ODR) Data Publisher.

    NASA Astrophysics Data System (ADS)

    Stone, N.; Lafuente, B.; Bristow, T.; Keller, R.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.

    2016-12-01

    Working primarily with astrobiology researchers at NASA Ames, the Open Data Repository (ODR) has been conducting a software pilot to meet the varying needs of this multidisciplinary community. Astrobiology researchers often have small communities or operate individually with unique data sets that don't easily fit into existing database structures. The ODR constructed its Data Publisher software to allow researchers to create databases with common metadata structures and subsequently extend them to meet their individual needs and data requirements. The software accomplishes these tasks through a web-based interface that allows collaborative creation and revision of common metadata templates and individual extensions to these templates for custom data sets. This allows researchers to search disparate datasets based on common metadata established through the metadata tools, but still facilitates distinct analyses and data that may be stored alongside the required common metadata. The software produces web pages that can be made publicly available at the researcher's discretion so that users may search and browse the data in an effort to make interoperability and data discovery a human-friendly task while also providing semantic data for machine-based discovery. Once relevant data has been identified, researchers can utilize the built-in application programming interface (API) that exposes the data for machine-based consumption and integration with existing data analysis tools (e.g. R, MATLAB, Project Jupyter - http://jupyter.org). The current evolution of the project has created the Astrobiology Habitable Environments Database (AHED)[1] which provides an interface to databases connected through a common metadata core. 
In the next project phase, the goal is for small research teams and groups to be self-sufficient in publishing their research data to meet funding mandates and academic requirements, while fostering increased data discovery and interoperability through human-readable and machine-readable interfaces. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL. [1] B. Lafuente et al. (2016) AGU, submitted.
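    The core/extension template idea described above can be sketched in a few lines; the field names below are invented for illustration and are not taken from the actual ODR or AHED schemas.

```python
# Every record carries a shared metadata core and may add custom
# extension fields; searches run against the common core only.
records = [
    {"core": {"sample_id": "S1", "mineral": "olivine"},
     "extension": {"raman_peak_cm1": 856}},
    {"core": {"sample_id": "S2", "mineral": "quartz"},
     "extension": {"xrd_two_theta": 26.6}},
]

def search_core(records, field, value):
    """Search disparate datasets on the common metadata core only."""
    return [r for r in records if r["core"].get(field) == value]

hits = search_core(records, "mineral", "quartz")
```

    This is what lets disparate datasets remain individually extended yet jointly searchable.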

  9. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
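    A rough sketch of the interconnection convention in (3), using hypothetical component and port names rather than TRANSFORM's actual Modelica interfaces: components declare named ports, and a plant model is assembled by recording connections between declared ports.

```python
# Illustrative sketch (not TRANSFORM's API) of port-based assembly.
class Component:
    def __init__(self, name, ports):
        self.name = name
        self.ports = set(ports)

connections = []

def connect(a, port_a, b, port_b):
    """Record a connection between two declared ports."""
    if port_a not in a.ports or port_b not in b.ports:
        raise ValueError("undeclared port")
    connections.append((a.name, port_a, b.name, port_b))

# Hypothetical two-component loop: core primary loop <-> steam generator.
core = Component("primary_loop", {"inlet", "outlet"})
sg = Component("steam_generator", {"primary_in", "primary_out"})
connect(core, "outlet", sg, "primary_in")
connect(sg, "primary_out", core, "inlet")
```

    In Modelica itself this role is played by connectors and connect-equations, which also enforce physical compatibility between the joined ports.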

  10. How to sharpen your automated tools.

    DOT National Transportation Integrated Search

    2014-12-01

    New programs that claim to make flying more efficient have several things in common: new tasks for pilots, new flight deck displays, automated support tools, changes to ground automation, and displays for air traffic control. Training is one of the t...

  11. CHARGE syndrome: a recurrent hotspot of mutations in CHD7 IVS25 analyzed by bioinformatic tools and minigene assays.

    PubMed

    Legendre, Marine; Rodriguez-Ballesteros, Montserrat; Rossi, Massimiliano; Abadie, Véronique; Amiel, Jeanne; Revencu, Nicole; Blanchet, Patricia; Brioude, Frédéric; Delrue, Marie-Ange; Doubaj, Yassamine; Sefiani, Abdelaziz; Francannet, Christine; Holder-Espinasse, Muriel; Jouk, Pierre-Simon; Julia, Sophie; Melki, Judith; Mur, Sébastien; Naudion, Sophie; Fabre-Teste, Jennifer; Busa, Tiffany; Stamm, Stephen; Lyonnet, Stanislas; Attie-Bitach, Tania; Kitzis, Alain; Gilbert-Dussardier, Brigitte; Bilan, Frédéric

    2018-02-01

    CHARGE syndrome is a rare genetic disorder mainly due to de novo and private truncating mutations of the CHD7 gene. Here we report an intriguing hotspot of intronic mutations (c.5405-7G > A, c.5405-13G > A, c.5405-17G > A and c.5405-18C > A) located in CHD7 IVS25. Combining computational in silico analysis, experimental branch-point determination and in vitro minigene assays, our study explains this mutation hotspot by a particular genomic context, including the weakness of the natural IVS25 acceptor site and an unconventional lariat sequence localized outside the common 40 bp upstream of the acceptor splice site. For each of the mutations reported here, bioinformatic tools indicated a newly created 3' splice site, whose existence was confirmed using pSpliceExpress, an easy-to-use and reliable splicing reporter tool. Our study emphasizes the idea that combining these two complementary approaches could increase the efficiency of routine molecular diagnosis.
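    Splice-site predictions of the kind referred to above often rest on models such as position weight matrices. A toy sketch of log-odds scoring of candidate acceptor sites follows; the matrix values are invented, not a trained acceptor-site model, and real tools train the probabilities on thousands of annotated splice sites.

```python
from math import log2

# Toy position-weight-matrix (PWM) scoring of candidate 3' splice
# (acceptor) sites over a 4-base motif ending in AG.
pwm = [  # invented base probabilities at each motif position
    {"A": 0.10, "C": 0.40, "G": 0.10, "T": 0.40},
    {"A": 0.10, "C": 0.40, "G": 0.10, "T": 0.40},
    {"A": 0.90, "C": 0.03, "G": 0.04, "T": 0.03},
    {"A": 0.03, "C": 0.03, "G": 0.90, "T": 0.04},
]
BACKGROUND = 0.25  # uniform base composition

def score(site):
    """Log-odds score of a candidate site against the background."""
    return sum(log2(pwm[i][base] / BACKGROUND)
               for i, base in enumerate(site))

weak_natural = score("TAAG")       # a weak natural acceptor
mutation_created = score("CTAG")   # a stronger, mutation-created site
```

    A mutation that creates a new AG in a favourable context can thus outscore a weak natural acceptor, which is the situation the minigene assays confirmed.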

  12. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    2017-09-01

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
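    The central operation behind capability (i), turning optical depth along a sight line into a spectrum, reduces to F = exp(-tau). A minimal standalone sketch with an arbitrary Gaussian tau profile (illustrative values, not trident's actual API):

```python
from math import exp

# Impose a Gaussian optical-depth profile on a flat continuum and
# convert optical depth tau to normalized flux via F = exp(-tau).
# Line centre, depth, and width are arbitrary illustrative values.
wavelengths = [1215.0 + 0.01 * i for i in range(100)]   # angstroms
CENTRE, TAU0, SIGMA = 1215.67, 2.0, 0.05

def flux(lam):
    tau = TAU0 * exp(-((lam - CENTRE) / SIGMA) ** 2 / 2)
    return exp(-tau)

spectrum = [flux(lam) for lam in wavelengths]
```

    Tools like trident build tau from Voigt profiles, ion densities, and velocities along the ray, but the tau-to-flux step is the same.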

  13. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  14. Transputer parallel processing at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1989-01-01

    The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
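    The real-coordinate plotting described above rests on a window-to-viewport mapping from real coordinates to screen pixels; a minimal sketch follows, with names and conventions that are illustrative rather than the lab's actual library API.

```python
# Map a point from a real-coordinate "window" to integer pixel
# coordinates in a screen "viewport".
def window_to_viewport(x, y, window, viewport):
    """window and viewport are (xmin, ymin, xmax, ymax) rectangles."""
    wx0, wy0, wx1, wy1 = window
    vx0, vy0, vx1, vy1 = viewport
    sx = (x - wx0) / (wx1 - wx0)   # normalized position in the window
    sy = (y - wy0) / (wy1 - wy0)
    # Flip y: screen coordinates usually grow downward.
    return (round(vx0 + sx * (vx1 - vx0)),
            round(vy1 - sy * (vy1 - vy0)))

px = window_to_viewport(0.5, 0.5, (0.0, 0.0, 1.0, 1.0), (0, 0, 640, 480))
```

    Multiple windows on a single screen then amount to applying the same mapping with different viewport rectangles.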

  15. Scientists' sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support.

    PubMed

    Mirel, Barbara; Görg, Carsten

    2014-04-26

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists' analytical workflows and their implications for tool design.

  16. Scientists’ sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support

    PubMed Central

    2014-01-01

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists’ analytical workflows and their implications for tool design. PMID:24766796

  17. SMART micro-scissors with dual motors and OCT sensors (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yeo, Chaebeom; Jang, Seonjin; Park, Hyun-cheol; Gehlbach, Peter L.; Song, Cheol

    2017-02-01

    Various end-effectors of microsurgical instruments have been developed and studied, along with many robotic approaches to stabilizing the tool tip, such as the steady-hand robot system, Micron, and the SMART system. In our previous study, a horizontal SMART micro-scissors with a common-path swept-source OCT distance sensor and one linear piezoelectric (PZT) motor was demonstrated as a microsurgical system. Because the outer needle is connected to a mechanical handle and moved manually to engage the tool tip, the tool-tip position changes instantaneously during engagement. This undesirable motion can cause unexpected tissue damage and low surgical accuracy. In this study, we present a prototype horizontal SMART micro-scissors with dual OCT sensors and two motors to improve tremor cancellation. The dual OCT sensors provide two distance measurements: the front sensor detects the distance from the sample surface to the tool tip, while the rear sensor gives the current PZT motor movement, acting like a motor encoder. The PZT motor compensates for hand tremor through feedback-loop control. The manual tool-tip engagement of the previous SMART system is replaced by electrical engagement using a squiggle motor. Compared with the previous study, this study showed better performance in hand-tremor reduction. These results suggest that SMART with automatic engagement may become increasingly valuable in microsurgical instruments.
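    A minimal proportional-control sketch of the tremor-compensation idea: the front-sensor reading is the tip-to-surface distance, and the motor moves to hold that distance at a set point. The gain, units, and tremor signal below are invented for illustration; the actual SMART controller is more sophisticated than this.

```python
# Proportional feedback driving the motor to cancel a hand offset.
def run_loop(target_um, tremor_um, gain=0.8):
    motor = 0.0                      # cumulative motor extension (microns)
    errors = []
    for hand_offset in tremor_um:
        measured = target_um + hand_offset - motor   # front-sensor reading
        error = measured - target_um
        motor += gain * error        # motor moves to cancel the offset
        errors.append(abs(error))
    return errors

# A sustained 5-micron hand offset is progressively cancelled.
errors = run_loop(350.0, [5.0, 5.0, 5.0, 5.0, 5.0])
```

    Each iteration the residual error shrinks by the factor (1 - gain), which is the basic reason a feedback loop of this shape converges for gains between 0 and 2.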

  18. Common Effects Methodology for Pesticides

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  19. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  20. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    PubMed

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design at a U.S. and a Middle Eastern university. A statistically significant improvement in overall research skills was supported in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was supported in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  1. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  2. The Toxicology Investigators Consortium Case Registry--the 2011 experience.

    PubMed

    Wiegand, Timothy J; Wax, Paul M; Schwartz, Tayler; Finkelstein, Yaron; Gorodetsky, Rachel; Brent, Jeffrey

    2012-12-01

    In 2010, the American College of Medical Toxicology established its Case Registry, the Toxicology Investigators Consortium (ToxIC). ToxIC is a prospective registry, which exclusively compiles suspected and confirmed toxic exposure cases cared for at the bedside by medical toxicologists at its participating sites. The Registry aims to fill two important gaps in the field: a real-time toxicosurveillance system to identify current poisoning trends and a powerful research tool in toxicology. ToxIC allows extraction of information from medical records, making it the most robust multicenter database on chemical toxicities in existence. All cases seen by medical toxicologists at participating institutions were entered in a database. Information characterizing patients entered in 2011 was tabulated. 2010 data was also included so that cumulative totals could be described as well. The current report is a summary of the data collected in 2011 in comparison to 2010 entries and also includes cumulative data through December 31st, 2011. During 2011, 28 sites with 49 specific institutions contributed a total of 6,456 cases to the Registry. The total number of cases entered into the registry at the end of 2011 was 10,392. Emergency departments remained the most common source of consultations in 2011, accounting for 53 % of cases. The most common reason for consultation was pharmaceutical overdose, which occurred in 48 % of patients, including intentional (37 %) and unintentional (11 %) exposures. The most common classes of agents were sedative-hypnotics (1,492 entries in 23 % of cases), non-opioid analgesics (1,368 entries in 21 % of cases), opioids (17 %), antidepressants (16 %), stimulants/sympathomimetics (12 %), and ethanol (8 %). N-acetylcysteine was the most commonly administered antidote during 2011, similar to 2010, followed by the opioid antagonist naloxone, sodium bicarbonate, physostigmine and flumazenil.
Anti-crotalid Fab fragments (CroFab) were administered in 106 out of 131 cases in which an envenomation occurred. There were 35 deaths recorded in the Registry during 2011. The most common associated agents, whether reported as a sole agent or in combination with other agents, were opioids and analgesics (acetaminophen, aspirin, NSAIDs), with ten and eight deaths, respectively. Oxycodone was reported in six of the ten opioid-related deaths and heroin in three. Acetaminophen was the most common single agent reported overall, being identified in all eight of the death cases attributed to analgesics. Significant trends were identified during 2011. Cases involving designer drugs, including psychoactive bath salts and synthetic cannabinoids, increased substantially from 2010 to 2011. The psychoactive bath salts were responsible for a large increase in stimulant/sympathomimetic-related cases reported to the Registry in 2011, with overall numbers doubling from 6 % of all Registry entries in 2010 to 12 % in 2011. Entries involving psychoactive drugs of abuse also increased twofold from 2010 to 2011, jumping from 3 % to 6 %, primarily due to the increasing frequency of synthetic cannabinoid ("K2")-related intoxications as 2011 progressed. The 2011 Registry included over 600 adverse drug reactions (ADRs; 10 % of Registry cases), with 115 agents causing at least two ADRs each, up from only 3 % of cases (116 total cases) in 2010. The ToxIC Case Registry continues to grow. At the end of 2011, over 10,000 cases had been entered into the Registry. As demonstrated by the trends identified in psychoactive bath salt and synthetic cannabinoid reports, the Registry is a valuable toxicosurveillance and research tool. The ToxIC Registry is a unique tool for identifying and characterizing confirmed cases of sufficient actual or potential toxicity or complexity to require bedside consultation by a medical toxicologist.

  3. Using an implicitly-coupled hydrologic and river-operations models to investigate the trade-offs of artificial recharge in agricultural areas

    NASA Astrophysics Data System (ADS)

    Morway, E. D.; Niswonger, R. G.; Triana, E.

    2016-12-01

    In irrigated agricultural regions supplied by both surface water and groundwater, increased reliance on groundwater during sustained drought leads to long-term water table drawdown and subsequent surface-water losses. This, in turn, may threaten the sustainability of the irrigation project. To help offset groundwater resource losses and restore water-supply reliability, an alternative management strategy commonly referred to as managed aquifer recharge (MAR) is used in agricultural regions to mitigate long-term aquifer drawdown and provide additional water for subsequent withdrawal. Sources of MAR in this investigation are limited to late winter runoff in years with above average precipitation (i.e., above average snowpack). However, where winter MAR results in an elevated water table, non-beneficial consumptive use may increase from evapotranspiration in adjacent and down-gradient fallow and naturally vegetated lands. To rigorously explore this trade-off, the recently published MODSIM-MODFLOW model was applied to quantify both the benefits and unintended consequences of MAR. MODSIM-MODFLOW is a generalized modeling tool capable of exploring the effects of altered river operations within an integrated groundwater and surface-water (GW-SW) model. Thus, the MODSIM-MODFLOW model provides a modeling platform capable of simulating MAR in amounts and durations consistent with other senior water rights in the river system (e.g., minimum in-stream flow requirements). Increases in non-beneficial consumptive use resulting from winter MAR are evaluated for a hypothetical model patterned after alluvial aquifers common in arid and semi-arid areas of the western United States.
Study results highlight (1) the benefit of an implicitly-coupled river operations and hydrologic modeling tool, (2) the balance between winter MAR and the potential increase in non-beneficial consumptive use, and (3) conditions where MAR may or may not be an appropriate management option, such as the availability of surface-water storage.

  4. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  5. EUV tools: hydrogen gas purification and recovery strategies

    NASA Astrophysics Data System (ADS)

    Landoni, Cristian; Succi, Marco; Applegarth, Chuck; Riddle Vogt, Sarah

    2015-03-01

    The technological challenges that have been overcome to make extreme ultraviolet lithography (EUV) a reality have been enormous [1]. This vacuum-driven technology poses significant purity challenges for the gases employed for purging and cleaning the scanner EUV chamber and source. Hydrogen, nitrogen, argon and ultra-high-purity compressed dry air (UHPCDA) are the most common gases utilized at the scanner and source level. Purity requirements are tighter than for previous technology-node tools. In addition, specifically for hydrogen, EUV tool users face not only gas-purity challenges but also the need for safe disposal of the hydrogen at the tool outlet. Recovery, reuse or recycling strategies could mitigate the disposal process and reduce the overall tool cost of operation. This paper reviews the types of purification technologies currently available to generate high-purity hydrogen suitable for EUV applications. Advantages and disadvantages of each purification technology will be presented, along with guidelines on how to select the most appropriate technology for each application and experimental conditions. A discussion of the most common approaches utilized at the facility level to operate EUV tools, along with possible hydrogen recovery strategies, will also be reported.

  6. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high contrast shadowing and glare, are a factor in performance. Computer based training using virtual environments is a common tool used to make and keep crew members proficient. If computer based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer based training increases proficiency if one trains for a camera based task using computer generated virtual environments with enhanced lighting conditions such as shadows and glare rather than the color shaded computer images normally used in simulators. Previous experiments were conducted using a two degree of freedom docking system. Test subjects had to align a boresight camera using a hand controller with two axes of rotation. Two sets of subjects were trained on two computer simulations using computer generated virtual environments, one with lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  7. Do Global Indicators of Protected Area Management Effectiveness Make Sense? A Case Study from Siberia

    NASA Astrophysics Data System (ADS)

    Anthony, Brandon P.; Shestackova, Elena

    2015-07-01

    Driven by the underperformance of many protected areas (PAs), protected area management effectiveness (PAME) evaluations are increasingly being conducted to assess PAs in meeting specified objectives. A number of PAME tools have been developed, many of which are based on the IUCN-WCPA framework constituting six evaluative elements (context, planning, input, process, output, and outcomes). In a quest for a more universal tool and using this framework, Leverington et al. (Environ Manag 46(5):685-698, 2010) developed a common scale and list of 33 headline indicators, purported to be representative across a wide range of management effectiveness evaluation tools. The usefulness of such composite tools and the relative weighting of indicators are still being debated. Here, we utilize these headline indicators as a benchmark to assess PAME in 37 PAs of four types in Krasnoyarsk Kray, Russia, and compare these with global results. Moreover, we review the usefulness of these indicators in the Krasnoyarsk context based on the opinions of local PA management teams. Overall, uncorrected management scores for studied PAs were slightly better (mean = 5.66 ± 0.875) than the global average, with output and outcome elements being strongest, and planning and process scores lower. Score variability is influenced by PA size, location, and type. When scores were corrected based on indicator importance, the mean score significantly increased to 5.75 ± 0.858. We emphasize idiosyncrasies of Russian PA management, including the relative absence of formal management plans and limited efforts toward local community beneficiation, and how such contextual differences may confound PAME scores when indicator weights are treated as equal.

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Identification of a rare variant haemoglobin (Hb Sinai-Baltimore) causing spuriously low haemoglobin A(1c) values on ion exchange chromatography.

    PubMed

    Smith, Geoff; Murray, Heather; Brennan, Stephen O

    2013-01-01

    Commonly used methods for assay of haemoglobin A(1c) (HbA(1c)) are susceptible to interference from the presence of haemoglobin variants. In many systems, the common variants can be identified but scientists and pathologists must remain vigilant for more subtle variants that may result in spuriously high or low HbA(1c) values. It is clearly important to recognize these events whether HbA(1c) is being used as a monitoring tool or, as is increasingly the case, for diagnostic purposes. We report a patient with a rare haemoglobin variant (Hb Sinai-Baltimore) that resulted in spuriously low values of HbA(1c) when assayed using ion exchange chromatography, and the steps taken to elucidate the nature of the variant.

  10. More than words: Using visual graphics for community-based health research.

    PubMed

    Morton Ninomiya, Melody E

    2017-04-20

    With increased attention to knowledge translation and community engagement in the applied health research field, many researchers aim to find effective ways of engaging health policy and decision makers and community stakeholders. While visual graphics such as graphs, charts, figures and photographs are common in scientific research dissemination, they are less common as a communication tool in research. In this commentary, I illustrate how and why visual graphics were created and used to facilitate dialogue and communication throughout all phases of a community-based health research study with a rural Indigenous community, advancing community engagement and knowledge utilization of a research study. I suggest that it is essential that researchers consider the use of visual graphics to accurately communicate and translate important health research concepts and content in accessible forms for diverse research stakeholders and target audiences.

  11. MiRduplexSVM: A High-Performing MiRNA-Duplex Prediction and Evaluation Methodology

    PubMed Central

    Karathanasis, Nestoras; Tsamardinos, Ioannis; Poirazi, Panayiota

    2015-01-01

    We address the problem of predicting the position of a miRNA duplex on a microRNA hairpin via the development and application of a novel SVM-based methodology. Our method combines a unique problem representation and an unbiased optimization protocol to learn from miRBase 19.0 an accurate predictive model, termed MiRduplexSVM. This is the first model that provides precise information about all four ends of the miRNA duplex. We show that (a) our method outperforms four state-of-the-art tools, namely MaturePred, MiRPara, MatureBayes, and MiRdup, as well as a Simple Geometric Locator, when applied on the same training datasets employed for each tool and evaluated on a common blind test set; (b) in all comparisons, MiRduplexSVM shows superior performance, achieving up to a 60% increase in prediction accuracy for mammalian hairpins, and can generalize very well on plant hairpins without any special optimization; (c) the tool has a number of important applications, such as the ability to accurately predict the miRNA or the miRNA*, given the opposite strand of a duplex. Its performance on this task is superior to the 2-nt overhang rule commonly used in computational studies and similar to that of a comparative genomic approach, without the need for prior knowledge or the complexity of performing multiple alignments. Finally, it is able to evaluate novel, potential miRNAs found either computationally or experimentally. In relation to recent confidence evaluation methods used in miRBase, MiRduplexSVM was successful in identifying high confidence potential miRNAs. PMID:25961860
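
    The abstract compares MiRduplexSVM against the simple 2-nt overhang rule for locating the star strand. As a hedged sketch of what that baseline rule computes, here is a Python toy that assumes a perfectly base-paired stem (real hairpins contain bulges and loops, which is exactly why a learned model can do better); the function name and coordinate convention are illustrative, not from any of the tools named above:

```python
def star_coords(mirna_start, mirna_end, hairpin_len):
    """Predict miRNA* coordinates from the mature miRNA position using
    the classic 2-nt 3'-overhang rule.

    Assumes an idealized, perfectly base-paired stem in which hairpin
    position i pairs with position hairpin_len - 1 - i (no loop, no
    bulges). Coordinates are 0-based, inclusive, with the mature miRNA
    on the 5' arm.
    """
    pair = lambda i: hairpin_len - 1 - i
    # The star 5' end pairs 2 nt inward from the miRNA 3' end,
    # leaving the miRNA with a 2-nt 3' overhang...
    star_start = pair(mirna_end - 2)
    # ...and the star 3' end extends 2 nt past the partner of the
    # miRNA 5' end, giving the star its own 2-nt 3' overhang.
    star_end = pair(mirna_start) + 2
    return star_start, star_end

s, e = star_coords(5, 26, 90)
print((s, e))           # star coordinates on the idealized hairpin
print(e - s == 26 - 5)  # star strand has the same length as the miRNA
```

The duplex partner always comes out the same length as the input strand under this idealization; on real hairpins the pairing function is irregular, so the rule drifts by a few nucleotides.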

  12. A web-based simulation of a longitudinal clinic used in a 4-week ambulatory rotation: a cohort study

    PubMed Central

    Wong, Rene WG; Lochnan, Heather A

    2009-01-01

    Background Residency training takes place primarily on inpatient wards. In the absence of a resident continuity clinic, internal medicine residents rely on block rotations to learn about continuity of care. Alternate methods to introduce continuity of care are needed. Methods A web-based tool, Continuity of Care Online Simulations (COCOS), was designed for use in a one-month, postgraduate clinical rotation in endocrinology. It is an interactive tool that simulates the continuing care of any patient with a chronic endocrine disease. Twenty-three residents in internal medicine participated in a study to investigate the effects of using COCOS during a clinical rotation in endocrinology on pre-post knowledge test scores and self-assessment of confidence. Results Compared to residents who did the rotation alone, residents who used COCOS during the rotation had significantly higher improvements in test scores (% increase in pre-post test scores +21.6 [standard deviation, SD, 8.0] vs. +5.9 [SD 6.8]; p < .001). Test score improvements were most pronounced for less commonly seen conditions. There were no significant differences in changes in confidence. Residents rated COCOS very highly, recommending its use as a standard part of the rotation and throughout residency. Conclusion A stand-alone web-based tool can be incorporated into an existing clinical rotation to help residents learn about continuity of care. It has the most potential to teach residents about topics that are less commonly seen during a clinical rotation. The adaptable, web-based format allows the creation of cases for most chronic medical conditions. PMID:19187554

  13. AE Monitoring of Diamond Turned Rapidly Solidified Aluminium 443

    NASA Astrophysics Data System (ADS)

    Onwuka, G.; Abou-El-Hossein, K.; Mkoko, Z.

    2017-05-01

    The fast replacement of conventional aluminium with rapidly solidified aluminium alloys has become a noticeable trend in the current manufacturing industries involved in the production of optics and optical molding inserts. This is a result of the improved performance and durability of rapidly solidified aluminium alloys compared to conventional aluminium. The melt spinning process is vital for manufacturing rapidly solidified aluminium alloys like RSA 905, RSA 6061 and RSA 443, which are common in industry today. RSA 443 is a newly developed alloy with few research findings and huge research potential. There is no available literature focused on monitoring the machining of RSA 443 alloys. In this research, an acoustic emission (AE) sensing technique was applied to monitor the single-point diamond turning of RSA 443 on an ultra-high precision lathe. The machining process was carried out after careful selection of feed, speed and depths of cut. The monitoring process was achieved with a high-sampling-rate data acquisition system using different tools, while concurrent measurements of surface roughness and tool wear were initiated after covering a total feed distance of 13 km. An increasing trend in raw AE spikes and peak-to-peak signal was observed with an increase in surface roughness and tool wear values. Hence, acoustic emission sensing proves to be an effective monitoring method for the machining of RSA 443 alloy.

  14. Research Techniques Made Simple: Web-Based Survey Research in Dermatology: Conduct and Applications.

    PubMed

    Maymone, Mayra B C; Venkatesh, Samantha; Secemsky, Eric; Reddy, Kavitha; Vashi, Neelam A

    2018-07-01

    Web-based surveys, or e-surveys, are surveys designed and delivered using the internet. The use of these survey tools is becoming increasingly common in medical research. Their advantages are appealing to surveyors because they allow for rapid development and administration of surveys, fast data collection and analysis, low cost, and fewer errors due to manual data entry than telephone or mailed questionnaires. Internet surveys may be used in clinical and academic research settings with improved speed and efficacy of data collection compared with paper or verbal survey modalities. However, limitations such as potentially low response rates, demographic biases, and variations in computer literacy and internet access remain areas of concern. We aim to briefly describe some of the currently available Web-based survey tools, focusing on advantages and limitations to help guide their use and application in dermatologic research. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Understanding Digital Note-Taking Practice for Visualization.

    PubMed

    Willett, Wesley; Goffin, Pascal; Isenberg, Petra

    2015-05-13

    We present results and design implications from a study of digital note-taking practice to examine how visualization can support revisitation, reflection, and collaboration around notes. As digital notebooks become common forms of external memory, keeping track of volumes of content is increasingly difficult. Information visualization tools can help give note-takers an overview of their content and allow them to explore diverse sets of notes, find and organize related content, and compare their notes with their collaborators. To ground the design of such tools, we conducted a detailed mixed-methods study of digital note-taking practice. We identify a variety of different editing, organization, and sharing methods used by digital note-takers, many of which result in notes becoming "lost in the pile''. These findings form the basis for our design considerations that examine how visualization can support the revisitation, organization, and sharing of digital notes.

  16. Cinematherapy and film as an educational tool in undergraduate psychiatry teaching: a case report and review of the literature.

    PubMed

    Hankir, Ahmed; Holloway, David; Zaman, Rashid; Agius, Mark

    2015-09-01

    Film possesses an extraordinary power and offers an unrivalled medium for entertainment and escapism. There are many films that revolve around a mental illness theme, and the medical specialty that most commonly features in motion pictures is psychiatry. Over the last few decades films have become increasingly used as an educational tool in the teaching of psychiatry topics, such as the mental state examination, to undergraduate students. Above and beyond its utility in pedagogy, film also has the power to heal, and the term cinematherapy has been coined to reflect this. Indeed, there are case studies of people with first-hand experience of psychopathology who report that watching films with a mental illness theme has contributed to their recovery. We provide a first-person narrative from an individual with schizophrenia in which he expounds on the concepts of cinematherapy and metaphorical imagery in films with a psychosis theme.

  17. GPS as a tool used in tourism as illustrated by selected mobile applications

    NASA Astrophysics Data System (ADS)

    Szark-Eckardt, Mirosława

    2017-11-01

    Mobile technologies have permanently changed our way of life. Their availability, common use, and introduction into virtually all areas of human activity mean that we can call the present the age of mobility [1]. Mobile applications based on the GPS module are among the most dynamically developing apps, as particularly reflected in tourism. A multitude of applications dedicated to different participants in tourism, which can be operated by means of smartphones or simple GPS trackers, is encouraging more people to reach for this kind of technology, perceiving it as a basic tool of today's tourism. Due to increasingly wide access to mobile applications, not only can a more dynamic development of tourism itself be noticed, but also a growth of healthy behaviours that comprises a positive "side effect" of tourism based on mobile technology. This article demonstrates a correlation between the health and physical condition of the population and the use of mobile applications.

  18. Mobile patient applications within diabetes - from few and easy to advanced functionalities.

    PubMed

    Årsand, Eirik; Skrøvseth, Stein Olav; Hejlesen, Ole; Horsch, Alexander; Godtliebsen, Fred; Grøttland, Astrid; Hartvigsen, Gunnar

    2013-01-01

    Patient diaries as apps on mobile phones are becoming increasingly common, and can be a good support tool for patients who need to organize information relevant to their disease. Self-management is important to achieving diabetes treatment goals and can be a tool for lifestyle changes for patients with Type 2 diabetes. The autoimmune disease Type 1 diabetes requires more intensive management than Type 2; thus, more advanced functionalities are desirable for these users. Both simple, easy-to-use diaries and more advanced ones have their respective benefits, depending on the target user group and intervention. In this poster we summarize the main findings and experience from more than a decade of research and development in the diabetes area. Several versions of the mobile health research platform, the Few Touch Application (FTA), are presented to illustrate the different approaches and results.

  19. A phylogenetic transform enhances analysis of compositional microbiota data.

    PubMed

    Silverman, Justin D; Washburne, Alex D; Mukherjee, Sayan; David, Lawrence A

    2017-02-15

    Surveys of microbial communities (microbiota), typically measured as relative abundance of species, have illustrated the importance of these communities in human health and disease. Yet, statistical artifacts commonly plague the analysis of relative abundance data. Here, we introduce the PhILR transform, which incorporates microbial evolutionary models with the isometric log-ratio transform to allow off-the-shelf statistical tools to be safely applied to microbiota surveys. We demonstrate that analyses of community-level structure can be applied to PhILR transformed data with performance on benchmarks rivaling or surpassing standard tools. Additionally, by decomposing distance in the PhILR transformed space, we identified neighboring clades that may have adapted to distinct human body sites. Decomposing variance revealed that covariation of bacterial clades within human body sites increases with phylogenetic relatedness. Together, these findings illustrate how the PhILR transform combines statistical and phylogenetic models to overcome compositional data challenges and enable evolutionary insights relevant to microbial communities.
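
    The abstract above applies a log-ratio transform so that standard ("off-the-shelf") statistical tools can safely be used on compositional data. As a rough illustration of the idea, here is a minimal Python/NumPy sketch of the simpler centred log-ratio (CLR) transform, a relative of the isometric log-ratio on which PhILR builds; the function name and pseudocount below are illustrative choices, not part of the PhILR package:

```python
import numpy as np

def clr(counts, pseudocount=0.5):
    """Centred log-ratio transform of one sample's count vector.

    Adds a pseudocount to avoid log(0), closes the counts to relative
    abundances, then subtracts the mean log-abundance (the log of the
    geometric mean) so the coordinates sum to zero and live in
    unconstrained Euclidean space.
    """
    x = np.asarray(counts, dtype=float) + pseudocount
    p = x / x.sum()              # relative abundances (composition)
    logp = np.log(p)
    return logp - logp.mean()    # centre by the geometric mean

sample = [120, 30, 0, 50]        # raw taxon counts for one sample
z = clr(sample)
print(np.isclose(z.sum(), 0.0))  # CLR coordinates sum to zero
```

PhILR goes further by choosing the log-ratio basis from the phylogenetic tree, which is what makes the transformed coordinates interpretable as contrasts between neighboring clades.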

  20. The use of 3D planning in facial surgery: preliminary observations.

    PubMed

    Hoarau, R; Zweifel, D; Simon, C; Broome, M

    2014-12-01

    Three-dimensional (3D) planning is becoming a more commonly used tool in maxillofacial surgery. At first used only virtually, 3D planning now also enables the creation of useful intraoperative aids such as cutting guides, which decrease the operative difficulty. In our center, we have used 3D planning in various domains of facial surgery and have investigated the advantages of this technique. We have also addressed the difficulties associated with its use. 3D planning increases the accuracy of reconstructive surgery, decreases operating time, whilst maintaining excellent esthetic results. However, its use is restricted to osseous reconstruction at this stage and once planning has been undertaken, it cannot be reversed or altered intraoperatively. Despite the attractive nature of this new tool, its uses and practicalities must be further evaluated. In particular, cost-effectiveness, hospital stay, and patient perceived benefits must be assessed. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  1. Large scale healthcare data integration and analysis using the semantic web.

    PubMed

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

    Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach on Hypergenes, an EU-funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.

  2. Strategic planning as a tool for achieving alignment in academic health centers.

    PubMed

    Higginbotham, Eve J; Church, Kathryn C

    2012-01-01

    After the passage of the Patient Protection and Affordable Care Act in March 2010, there is an urgent need for medical schools, teaching hospitals, and practice plans to work together seamlessly across a common mission. Although there is agreement that there should be greater coordination of initiatives and resources, there is little guidance in the literature on the method to achieve the necessary transformation. Traditional approaches to strategic planning often engage a few leaders and produce a set of immeasurable initiatives. A nontraditional approach, consisting of Whole-Scale (Dannemiller Tyson Associates, Ann Arbor, MI) engagement, appreciative inquiry, and a balanced scorecard, can more rapidly transform an academic health center. Using this nontraditional approach to strategic planning, increased organizational awareness was achieved in a single academic health center. Strategic planning can be an effective tool to achieve alignment and enhance accountability, and a first step in meeting the demands of the new landscape of healthcare.

  3. [Beekeeping: study of organization, general hazards assessment, pre-assessment of risk for mechanical overload using a new tool for easy application].

    PubMed

    Ruschioni, Angela; Montesi, Simona; Spagnuolo, Loreta Maria; Rinaldi, Lucia; Fantozzi, Lucia; Fanti, M

    2011-01-01

    Beekeeping is a common activity in the two regions in this study, Marche and Tuscany: in both regions the numbers of beekeepers, both amateur and professional, and honey production are high. The aim was to study, through the application of simple tools, the organization of beekeeping activity so as to identify hazardous situations in the work process. We followed the production cycle of two businesses that differed in size and work organization for a period of twelve months. Subsequently, each homogeneous period was assessed via increasingly complex levels of intervention, which made it possible to identify the work phases where preventive measures could be applied. The results obtained made it possible to detect the presence of risk situations for the musculoskeletal system of beekeepers. Organizational analysis in the two enterprises showed that it is possible to apply easy solutions to improve safety and health at the workplace.

  4. Bringing "Scientific Expeditions" into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Kutler, Paul (Technical Monitor)

    1994-01-01

    Schools can obtain scientific information over the information superhighway. However, information suppliers use formats chosen to permit access and analysis by the "least common denominator" of access and analysis tools. The result: most sources of dynamic representations of science are in the format of flat movies. We can shorten the time to get "scientific expeditions" into schools, and provide a unifying focus to vendors and information suppliers, by establishing a target and goals for the "least common denominator" of tools used to access and analyze information over the information superhighway.

  5. Identification of Patients at Risk for Hereditary Colorectal Cancer

    PubMed Central

    Mishra, Nitin; Hall, Jason

    2012-01-01

    Diagnosis of hereditary colorectal cancer syndromes requires clinical suspicion and knowledge of such syndromes. Lynch syndrome is the most common cause of hereditary colorectal cancer. Other less common causes include familial adenomatous polyposis (FAP), Peutz-Jeghers syndrome (PJS), juvenile polyposis syndrome, and others. A growing number of clinical and molecular tools are used to screen and test at-risk individuals. Screening tools include diagnostic clinical criteria, family history, genetic prediction models, and tumor testing. Patients who are at high risk based on screening should be referred for genetic testing. PMID:23730221

  6. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results

    DTIC Science & Technology

    2017-08-01

    This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF; UCAR/Unidata Program Center, Boulder, CO, http://www.unidata.ucar.edu/software/netcdf). The underlying emissions scenarios diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors, and the estimated emissions therefore differ across scenarios.

  7. Talking to Iraq and Afghanistan war veterans about tobacco use.

    PubMed

    Widome, Rachel; Joseph, Anne M; Polusny, Melissa A; Chlebeck, Bernadette; Brock, Betsy; Gulden, Ashley; Fu, Steven S

    2011-07-01

    Our goal in this study was to examine beliefs and attitudes about tobacco use in the newest generation of combat veterans, those who served in Afghanistan (Operation Enduring Freedom [OEF]) and Iraq (Operation Iraqi Freedom [OIF]). We held 5 focus groups (n = 17) with Minnesota Army National Guard soldiers who had recently returned from combat deployment in support of OEF/OIF. Sessions were audiorecorded, transcribed, coded, and analyzed using a grounded theory approach. We found that it is common to use tobacco in the combat zone for stress and anger management and boredom relief. Tobacco was also a tool for staying alert, a way to socialize, and provided a chance to take breaks. Participants recognized the culture of tobacco use in the military. Stress, nicotine dependence, the tobacco environment at drill activities, and perceived inaccessibility of cessation tools perpetuated use at home and served as a barrier to cessation. Repeatedly, participants cited tobacco policies (such as increased taxes and smoke-free workspaces) as motivators for quitting. There are specific circumstances common to combat zones that promote tobacco use. Results suggest that environmental changes that address the prominence of tobacco in military culture, the acceptance of nonsmoking breaks, and cessation programs that address stress issues and make cessation aids available may be effective in reducing tobacco use.

  8. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    PubMed

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded-style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
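
    The abstract mentions a "combined PEP" approach built on Bayes' theorem. The following Python sketch shows one naive way such a combination can work in principle, treating the search engines as independent witnesses for the same peptide-spectrum match; it illustrates the general idea only and is not Ursgal's actual implementation:

```python
def combine_pep(peps):
    """Naively combine posterior error probabilities (PEPs) reported by
    several search engines for the same peptide-spectrum match.

    Treats the engines as independent and multiplies the per-engine
    probabilities of the match being correct (1 - PEP) and incorrect
    (PEP), then renormalizes. An illustrative toy, not Ursgal's exact
    "combined PEP" algorithm.
    """
    p_correct, p_incorrect = 1.0, 1.0
    for pep in peps:
        p_correct *= (1.0 - pep)   # all engines right about a true hit
        p_incorrect *= pep         # all engines fooled by a false hit
    return p_incorrect / (p_correct + p_incorrect)

# Two engines agreeing at PEP 0.1 yield a combined PEP well below 0.1.
print(combine_pep([0.1, 0.1]) < 0.1)
```

The practical point is the same one the paper makes: agreement between independently postprocessed engines is stronger evidence than any single engine's score.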

  9. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus helping to increase the product and process understanding that is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used, principal component analysis and partial least squares regression, covering their advantages, common pitfalls, and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
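
    The review centers on principal component analysis and partial least squares regression. As a minimal reminder of what PCA computes, here is a hedged Python/NumPy sketch using the SVD of mean-centred data (the function and variable names are illustrative, not from any pharmaceutical toolkit):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD of mean-centred data.

    Returns the scores (projections onto the leading components) and
    the fraction of total variance explained by each retained component.
    """
    Xc = X - X.mean(axis=0)                       # centre each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T             # sample coordinates
    explained = (s ** 2) / (s ** 2).sum()         # variance fractions
    return scores, explained[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))                      # 50 batches x 5 variables
X[:, 1] = 3 * X[:, 0] + 0.1 * X[:, 1]             # inject a dominant direction
scores, expl = pca(X)
print(scores.shape)                               # (50, 2)
print(expl[0] > expl[1])                          # first PC explains the most
```

PLS regression follows the same projection idea but chooses components to maximize covariance with a response (e.g. assay values) rather than variance alone.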

  10. Pressing needs of biomedical text mining in biocuration and beyond: opportunities and challenges.

    PubMed

    Singhal, Ayush; Leaman, Robert; Catlett, Natalie; Lemberger, Thomas; McEntyre, Johanna; Polson, Shawn; Xenarios, Ioannis; Arighi, Cecilia; Lu, Zhiyong

    2016-01-01

    Text mining in the biomedical sciences is rapidly transitioning from small-scale evaluation to large-scale application. In this article, we argue that text-mining technologies have become essential tools in real-world biomedical research. We describe four large scale applications of text mining, as showcased during a recent panel discussion at the BioCreative V Challenge Workshop. We draw on these applications as case studies to characterize common requirements for successfully applying text-mining techniques to practical biocuration needs. We note that system 'accuracy' remains a challenge and identify several additional common difficulties and potential research directions including (i) the 'scalability' issue due to the increasing need of mining information from millions of full-text articles, (ii) the 'interoperability' issue of integrating various text-mining systems into existing curation workflows and (iii) the 'reusability' issue on the difficulty of applying trained systems to text genres that are not seen previously during development. We then describe related efforts within the text-mining community, with a special focus on the BioCreative series of challenge workshops. We believe that focusing on the near-term challenges identified in this work will amplify the opportunities afforded by the continued adoption of text-mining tools. Finally, in order to sustain the curation ecosystem and have text-mining systems adopted for practical benefits, we call for increased collaboration between text-mining researchers and various stakeholders, including researchers, publishers and biocurators. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.

  11. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge that severely slows down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
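    The memory-mapping strategy described in this record can be sketched in Python with `numpy.memmap` (the packages themselves are written in R; the file name and matrix dimensions below are illustrative assumptions, not part of bigstatsr):

```python
import numpy as np

# Create a genotype-like matrix on disk (random 0/1/2 allele dosages) so the
# example is self-contained; a real dataset would already exist as a file.
n_ind, n_snp = 1_000, 5_000          # illustrative sizes
path = "genotypes.memmap"
g = np.memmap(path, dtype=np.int8, mode="w+", shape=(n_ind, n_snp))
rng = np.random.default_rng(0)
g[:] = rng.integers(0, 3, size=(n_ind, n_snp), dtype=np.int8)
g.flush()

# Re-open the file read-only: only the blocks actually accessed are paged
# into RAM, so matrices far larger than memory can be processed block-wise.
g2 = np.memmap(path, dtype=np.int8, mode="r", shape=(n_ind, n_snp))
block = 1_000
allele_freq = np.empty(n_snp)
for j in range(0, n_snp, block):
    # Dosages are in {0, 1, 2}, so dividing the column mean by 2 gives the
    # estimated alternate-allele frequency per SNP.
    allele_freq[j:j + block] = g2[:, j:j + block].mean(axis=0) / 2

print(allele_freq[:3])
```

    The key design point is that each column block is read from disk on demand, so peak RAM usage depends on the block size rather than on the full matrix.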

  12. Fueling industrial biotechnology growth with bioethanol.

    PubMed

    Otero, José Manuel; Panagiotou, Gianni; Olsson, Lisbeth

    2007-01-01

    Industrial biotechnology is the conversion of biomass via biocatalysis, microbial fermentation, or cell culture to produce chemicals, materials, and/or energy. Industrial biotechnology processes aim to be cost-competitive, environmentally favorable, and self-sustaining compared to their petrochemical equivalents. Common to all processes for the production of energy, commodity, added value, or fine chemicals is that raw materials comprise the most significant cost fraction, particularly as operating efficiencies increase through practice and improving technologies. Today, crude petroleum represents the dominant raw material for the energy and chemical sectors worldwide. Within the last 5 years petroleum prices, stability, and supply have increased, decreased, and been threatened, respectively, driving a renewed interest across academic, government, and corporate centers to utilize biomass as an alternative raw material. Specifically, bio-based ethanol as an alternative biofuel has emerged as the single largest biotechnology commodity, with close to 46 billion L produced worldwide in 2005. Bioethanol is a leading example of how systems biology tools have significantly enhanced metabolic engineering, inverse metabolic engineering, and protein and enzyme engineering strategies. This enhancement stems from method development for measurement, analysis, and data integration of functional genomics, including the transcriptome, proteome, metabolome, and fluxome. This review will show that future industrial biotechnology process development will benefit tremendously from the precedent set by bioethanol - that enabling technologies (e.g., systems biology tools) coupled with favorable economic and socio-political driving forces do yield profitable, sustainable, and environmentally responsible processes. Biofuel will continue to be the keystone of any industrial biotechnology-based economy whereby biorefineries leverage common raw materials and unit operations to integrate diverse processes to produce demand-driven product portfolios.

  13. Commercial Molecular Tests for Fungal Diagnosis from a Practical Point of View.

    PubMed

    Lackner, Michaela; Lass-Flörl, Cornelia

    2017-01-01

    The increasing interest in molecular diagnostics is a result of tremendously improved knowledge of fungal infections over the past 20 years and the rapid development of new methods, in particular the polymerase chain reaction. High expectations have been placed on molecular diagnostics, and the number of laboratories using the relevant technology is rapidly increasing, resulting in an obvious need for standardization and a defined laboratory organization. In the past 10 years, multiple new molecular tools were marketed for the detection of DNA, antibodies, cell wall components, or other antigens. In contrast to classical culture methods, molecular methods do not detect a viable organism, but only molecules that indicate its presence; these can be nucleic acids, cell components (antigens), or antibodies (Fig. 1). In this chapter, we provide an overview of commercially available detection tools, their strengths, and how to use them. A main focus is placed on tips and tricks that make daily laboratory life easier; we highlight methodological details that are not covered in the manufacturers' instructions for these test kits but are based on our personal experience in the laboratory. It is important to keep in mind that molecular tools cannot replace culture, microscopy, or a critical view of the patient's clinical history, signs, and symptoms, but they provide a valuable add-on. Diagnosis should not be based solely on a molecular test; rather, molecular tools may deliver an important piece of information that helps complete the diagnostic puzzle, particularly since few tests are in vitro diagnostic (IVD) certified, or only part of the whole test carries the IVD certificate (e.g., DNA extraction is often not included). Please be aware that the authors do not claim to provide a complete overview of all commercially available diagnostic assays currently marketed for fungal detection, as those are subject to constant change. The main focus is on commonly used panfungal assays and pathogen-specific assays, including Aspergillus-specific, Candida-specific, Cryptococcus-specific, Histoplasma-specific, and Pneumocystis-specific assays. Assays are categorized according to their underlying principle as antigen-detecting, antibody-detecting, or DNA-detecting (Fig. 1). Other nucleic acid-based methods such as FISH and PNA-FISH are not summarized in this chapter; an overview of test performance, common false positives, and the clinical evaluation of commercial tests was already provided in a previous volume by Javier Yugueros Marcos and David H. Pincus (Marcos and Pincus, Methods Mol Biol 968:25-54, 2013).

  14. A Comparative Study of Interval Management Control Law Capabilities

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Smith, Colin L.; Palmer, Susan O.; Abbott, Terence S.

    2012-01-01

    This paper presents a new tool designed to allow for rapid development and testing of different control algorithms for airborne spacing. This tool, the Interval Management Modeling and Spacing Tool (IM MAST), is a fast-time, low-fidelity tool created to model the approach of aircraft to a runway, with a focus on their interactions with each other. Errors can be induced between pairs of aircraft by varying initial positions, winds, speed profiles, and altitude profiles. Results to date show that only a few of the algorithms tested had poor behavior in the arrival and approach environment. The majority of the algorithms showed only minimal variation in performance under the test conditions. Trajectory-based algorithms showed high susceptibility to wind forecast errors, while performing marginally better than the other algorithms under other conditions. Trajectory-based algorithms have a sizable advantage, however, of being able to perform relative spacing operations between aircraft on different arrival routes and flight profiles without employing ghosting methods. This advantage comes at the cost of substantially increased complexity. Additionally, it was shown that earlier initiation of relative spacing operations provided more time for corrections to be made without any significant problems in the spacing operation itself. Initiating spacing farther out, however, would require more of the aircraft to begin spacing before they merge onto a common route.

  15. BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.

    PubMed

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-06-01

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.
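    Automatic detection of common file formats, which BuddySuite performs internally before parsing, can be approximated with a simple content sniffer. The heuristics below are our own illustration and an assumption for clarity, not BuddySuite's actual implementation:

```python
def sniff_format(text: str) -> str:
    """Guess a sequence/alignment/tree format from file content.

    Crude illustrative heuristics only: real tools (including BuddySuite)
    use far more robust parsing and fall back on trial parsing.
    """
    s = text.lstrip()
    if s.startswith(">"):
        return "fasta"
    if s.startswith("#NEXUS"):
        return "nexus"
    if s.startswith("CLUSTAL"):
        return "clustal"
    if s.startswith("(") and s.rstrip().endswith(";"):
        return "newick"
    first = s.splitlines()[0].split() if s else []
    if len(first) == 2 and all(tok.isdigit() for tok in first):
        return "phylip"   # header line: <num_taxa> <alignment_length>
    return "unknown"

print(sniff_format(">seq1\nACGT\n"))   # fasta
print(sniff_format("((A,B),C);"))      # newick
```

    Detecting the format from content rather than from the file extension is what lets a pipeline of such tools pass data between steps without manual conversion.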

  16. Dissemination and implementation of an educational tool for veterans on complementary and alternative medicine: a case study.

    PubMed

    Held, Rachel Forster; Santos, Susan; Marki, Michelle; Helmer, Drew

    2016-09-02

    We developed and disseminated an educational DVD to introduce U.S. Veterans to independently-practiced complementary and alternative medicine (CAM) techniques and encourage CAM experimentation. The project's goal was to determine optimal dissemination methods to facilitate implementation within the Veterans Health Administration. In the first phase, the DVD was disseminated using four methods: passive, provider-mediated, active, and peer-mediated. In the second, implementation phase, "champion" providers who supported CAM integrated dissemination into clinical practice. Qualitative data came from Veteran focus groups and semi-structured provider interviews. Data from both phases were triangulated to identify common themes. Effective dissemination requires engaging patients. Providers who most successfully integrated the DVD into practice already had CAM knowledge, and worked in settings where CAM was accepted clinical practice, or with leadership or infrastructure that supported a culture of CAM use. Institutional buy-in allowed for provider networking and effective implementation of the tool. Providers were given autonomy to determine the most appropriate dissemination strategies, which increased enthusiasm and use. Many of the lessons learned from this project can be applied to dissemination of any new educational tool within a healthcare setting. Results reiterate the importance of utilizing best practices for introducing educational tools within the healthcare context and the need for thoughtful, multi-faceted dissemination strategies.

  17. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.

    PubMed

    Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel

    2018-05-22

    Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.

  18. GPS-Lipid: a robust tool for the prediction of multiple lipid modification sites.

    PubMed

    Xie, Yubin; Zheng, Yueyuan; Li, Hongyu; Luo, Xiaotong; He, Zhihao; Cao, Shuo; Shi, Yi; Zhao, Qi; Xue, Yu; Zuo, Zhixiang; Ren, Jian

    2016-06-16

    As one of the most common post-translational modifications in eukaryotic cells, lipid modification is an important mechanism for the regulation of various aspects of protein function. Over the last decades, three classes of lipid modifications have been increasingly studied. The co-regulation of these different lipid modifications is beginning to be noticed. However, due to the lack of integrated bioinformatics resources, studies of co-regulatory mechanisms remain very limited. In this work, we developed a tool called GPS-Lipid for the prediction of four classes of lipid modifications by integrating Particle Swarm Optimization with an aging leader and challengers (ALC-PSO) algorithm. GPS-Lipid was shown to be clearly superior to other similar tools. To facilitate the research of lipid modification, we host a publicly available web server at http://lipid.biocuckoo.org with not only the implementation of GPS-Lipid, but also an integrative database and visualization tool. We performed a systematic analysis of the co-regulatory mechanism between different lipid modifications with GPS-Lipid. The results demonstrate that proximal dual-lipid modifications among palmitoylation, myristoylation, and prenylation are a key mechanism for regulating various protein functions. In conclusion, GPS-Lipid is expected to serve as a useful resource for research on lipid modifications, especially their co-regulation.
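    The core of particle swarm optimization, on which the ALC-PSO variant used by GPS-Lipid builds, can be sketched as follows. This is a generic textbook PSO minimizing a toy objective, not the authors' implementation or the aging-leader variant:

```python
import random

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over [lo, hi]^dim with a standard particle swarm."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: sphere function, global minimum 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
print(best_val)
```

    ALC-PSO extends this scheme by replacing a "leader" (global best) that stops improving, which counters the premature convergence a plain swarm like this one can exhibit.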

  19. Developing a framework for digital objects in the Big Data to Knowledge (BD2K) commons: Report from the Commons Framework Pilots workshop.

    PubMed

    Jagodnik, Kathleen M; Koplev, Simon; Jenkins, Sherry L; Ohno-Machado, Lucila; Paten, Benedict; Schurer, Stephan C; Dumontier, Michel; Verborgh, Ruben; Bui, Alex; Ping, Peipei; McKenna, Neil J; Madduri, Ravi; Pillai, Ajay; Ma'ayan, Avi

    2017-07-01

    The volume and diversity of data in biomedical research have been rapidly increasing in recent years. While such data hold significant promise for accelerating discovery, their use entails many challenges including: the need for adequate computational infrastructure, secure processes for data sharing and access, tools that allow researchers to find and integrate diverse datasets, and standardized methods of analysis. These are just some elements of a complex ecosystem that needs to be built to support the rapid accumulation of these data. The NIH Big Data to Knowledge (BD2K) initiative aims to facilitate digitally enabled biomedical research. Within the BD2K framework, the Commons initiative is intended to establish a virtual environment that will facilitate the use, interoperability, and discoverability of shared digital objects used for research. The BD2K Commons Framework Pilots Working Group (CFPWG) was established to clarify goals and work on pilot projects that address existing gaps toward realizing the vision of the BD2K Commons. This report reviews highlights from a two-day meeting involving the BD2K CFPWG to provide insights on trends and considerations in advancing Big Data science for biomedical research in the United States. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Classification and assessment tools for structural motif discovery algorithms.

    PubMed

    Badr, Ghada; Al-Turaiki, Isra; Mathkour, Hassan

    2013-01-01

    Motif discovery is the problem of finding recurring patterns in biological data. Patterns can be sequential, mainly when discovered in DNA sequences. They can also be structural (e.g. when discovering RNA motifs). Finding common structural patterns helps to gain a better understanding of the mechanism of action (e.g. post-transcriptional regulation). Unlike DNA motifs, which are sequentially conserved, RNA motifs exhibit conservation in structure, which may be common even if the sequences are different. Over the past few years, hundreds of algorithms have been developed to solve the sequential motif discovery problem, while less work has been done for the structural case. In this paper, we survey, classify, and compare different algorithms that solve the structural motif discovery problem, where the underlying sequences may be different. We highlight their strengths and weaknesses. We start by proposing a benchmark dataset and a measurement tool that can be used to evaluate different motif discovery approaches. Then, we proceed by proposing our experimental setup. Finally, results are obtained using the proposed benchmark to compare available tools. To the best of our knowledge, this is the first attempt to compare tools solely designed for structural motif discovery. Results show that the accuracy of discovered motifs is relatively low. The results also suggest a complementary behavior among tools where some tools perform well on simple structures, while other tools are better for complex structures. We have classified and evaluated the performance of available structural motif discovery tools. In addition, we have proposed a benchmark dataset with tools that can be used to evaluate newly developed tools.
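    Assessment of motif discovery output typically relies on site-level agreement between predicted and annotated motif positions. A minimal sketch of such a scorer follows; this is our own illustration of standard nucleotide-level metrics, not the benchmark tool proposed in the paper:

```python
def site_level_scores(predicted, annotated):
    """Nucleotide-level sensitivity, precision, and F-score for motif sites.

    Both arguments are sets of (sequence_id, position) pairs covered by a
    predicted or annotated motif occurrence.
    """
    tp = len(predicted & annotated)      # positions found by both
    sens = tp / len(annotated) if annotated else 0.0
    prec = tp / len(predicted) if predicted else 0.0
    f = 2 * sens * prec / (sens + prec) if (sens + prec) else 0.0
    return sens, prec, f

# Toy example: the predicted site overlaps 3 of 4 annotated positions and
# adds 1 spurious position.
ann = {("seq1", i) for i in range(10, 14)}    # annotated site: 10..13
pred = {("seq1", i) for i in range(11, 15)}   # predicted site: 11..14
print(site_level_scores(pred, ann))           # (0.75, 0.75, 0.75)
```

    Position-set comparison of this kind applies equally to sequential and structural motifs, as long as predicted occurrences can be mapped back to sequence coordinates.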

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. The Blooming Anatomy Tool (BAT): A Discipline-Specific Rubric for Utilizing Bloom's Taxonomy in the Design and Evaluation of Assessments in the Anatomical Sciences

    ERIC Educational Resources Information Center

    Thompson, Andrew R.; O'Loughlin, Valerie D.

    2015-01-01

    Bloom's taxonomy is a resource commonly used to assess the cognitive level associated with course assignments and examination questions. Although widely utilized in educational research, Bloom's taxonomy has received limited attention as an analytical tool in the anatomical sciences. Building on previous research, the Blooming Anatomy Tool (BAT)…

  9. A “walker” tool to place Diaphorina citri (Hemiptera: Liviidae) adults at predetermined sites for bioassays of behavior in citrus (Sapindales: Rutaceae) trees

    USDA-ARS?s Scientific Manuscript database

    A walker tool was developed to assist placement of D. citri on citrus host trees in behavioral bioassays. The walker performs better than a commonly used paintbrush tool in the proportion of successful placements and in the reduction of jumps away from the citrus leaf, although it takes about two mi...

  10. Visual Reasoning Tools in Action: Double Number Lines, Area Models, and Other Diagrams Power Up Students' Ability to Solve and Make Sense of Various Problems

    ERIC Educational Resources Information Center

    Watanabe, Tad

    2015-01-01

    The Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) identifies the strategic use of appropriate tools as one of the mathematical practices and emphasizes the use of pictures and diagrams as reasoning tools. Starting with the early elementary grades, CCSSM discusses students' solving of problems "by drawing." In later…

  11. Making Scientific Data Usable and Useful

    NASA Astrophysics Data System (ADS)

    Satwicz, T.; Bharadwaj, A.; Evans, J.; Dirks, J.; Clark Cole, K.

    2017-12-01

    Transforming geological data into information that has broad scientific and societal impact is a process fraught with barriers. Data sets and tools are often reported to have poor user experiences (UX) that make scientific work more challenging than it needs to be. While many other technical fields have benefited from ongoing improvements to the UX of their tools (e.g., healthcare and financial services), scientists are faced with using tools that are labor-intensive and not intuitive. Our research team has been involved in a multi-year effort to understand and improve the UX of scientific tools and data sets. We use a User-Centered Design (UCD) process that involves naturalistic behavioral observation and other qualitative research methods adopted from Human-Computer Interaction (HCI) and related fields. Behavioral observation involves having users complete common tasks on data sets, tools, and websites to identify usability issues and understand their severity. We measure how successfully users complete tasks and diagnose the cause of any failures. Behavioral observation is paired with in-depth interviews in which users describe their process for generating results, from initial inquiry to final results. By asking detailed questions we unpack common patterns and challenges scientists experience while working with data. We have found that tools built using the UCD process can have a large impact on scientists' workflows and greatly reduce the time it takes to process data before analysis. It is often challenging to understand the organization and nuances of data across scientific fields. By better understanding how scientists work, we can create tools that make routine tasks less labor-intensive, make data easier to find, and solve common issues with discovering new data sets and engaging in interdisciplinary research. There is a tremendous opportunity to advance scientific knowledge and help the public benefit from that work by creating intuitive, interactive, and powerful tools and resources for generating knowledge. The pathway to achieving that is building a detailed understanding of users and their needs, then using this knowledge to inform the design of the data products, tools, and services scientists and non-scientists use to do their work.

  12. PDA usage and training: targeting curriculum for residents and faculty.

    PubMed

    Morris, Carl G; Church, Lili; Vincent, Chris; Rao, Ashwin

    2007-06-01

    Utilization of personal digital assistants (PDAs) in residency education is common, but information about their use and how residents are trained to use them is limited. Better understanding of resident and faculty PDA use and training is needed. We used a cross-sectional survey of 598 residents and faculty from the WWAMI (Washington, Wyoming, Alaska, Montana, and Idaho) Family Medicine Residency Network regarding PDA usage and training. Use of PDAs is common among residents (94%) and faculty (79%). Ninety-six percent of faculty and residents report stable or increasing frequency of use over time. The common barriers to PDA use relate to lack of time, knowledge, and formal education. Approximately half of PDA users (52%) have received some formal training; however, the majority of users report being self-taught. Faculty and residents prefer either small-group or one-on-one settings with hands-on, self-directed, interactive formats for PDA training. Large-group settings in lecture, written, or computer program formats were considered less helpful or desirable. PDAs have become a commonly used clinical tool. Lack of time and adequate training present a barrier to optimal application of PDAs in family medicine residency education.

  13. The Potential of Digital Technologies to Support Literacy Instruction Relevant to the Common Core State Standards

    ERIC Educational Resources Information Center

    Hutchison, Amy C.; Colwell, Jamie

    2014-01-01

    Digital tools have the potential to transform instruction and promote literacies outlined in the Common Core State Standards. Empirical research is examined to illustrate this potential in grades 6-12 instruction.

  14. Common Effects Methodology National Stakeholder Meeting December 1, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  15. Visual system manifestations of Alzheimer's disease.

    PubMed

    Kusne, Yael; Wolf, Andrew B; Townley, Kate; Conway, Mandi; Peyman, Gholam A

    2017-12-01

    Alzheimer's disease (AD) is an increasingly common disease with massive personal and economic costs. While it has long been known that AD impacts the visual system, there has recently been an increased focus on understanding both pathophysiological mechanisms that may be shared between the eye and brain and how related biomarkers could be useful for AD diagnosis. Here, we review pertinent cellular and molecular mechanisms of AD pathophysiology, the presence of AD pathology in the visual system, associated functional changes, and potential development of diagnostic tools based on the visual system. Additionally, we discuss links between AD and visual disorders, including possible pathophysiological mechanisms and their relevance for improving our understanding of AD. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  16. Biochemical markers for prediction of preeclampsia: review of the literature

    PubMed Central

    Monte, Santo

    2011-01-01

    Preeclampsia (PE) is one of the most common diseases worldwide, complicating ~5% of all pregnancies. Although no major progress has been achieved in the treatment of PE, our ability to identify women at high risk has increased considerably during the past decade. The early identification of patients with an increased risk for preeclampsia is therefore one of the most important goals in obstetrics. Today, several markers may offer the potential to be used, most likely in a combinatory analysis, as predictors or diagnostic tools. We present here the current knowledge on the biology of preeclampsia and review several biochemical markers which may be used to monitor preeclampsia in a future that, we hope, is not too distant from today. PMID:22439080

  17. Biochemical markers for prediction of preeclampsia: review of the literature.

    PubMed

    Monte, Santo

    2011-07-01

    Preeclampsia (PE) is one of the most common diseases worldwide, complicating ~5% of all pregnancies. Although no major progress has been achieved in the treatment of PE, our ability to identify women at high risk has increased considerably during the past decade. The early identification of patients with an increased risk for preeclampsia is therefore one of the most important goals in obstetrics. Today, several markers may offer the potential to be used, most likely in a combinatory analysis, as predictors or diagnostic tools. We present here the current knowledge on the biology of preeclampsia and review several biochemical markers which may be used to monitor preeclampsia in a future that, we hope, is not too distant from today.

  18. Editorial: Biological Engagement Programs: Reducing Threats and Strengthening Global Health Security Through Scientific Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fair, Jeanne M.

    It is often said about infectious diseases that a “threat anywhere is a threat everywhere,” and the recent outbreaks of Ebola in West Africa and Zika virus in South America have proven that pathogens know no borders. Not only are they transboundary, pathogens do not discriminate who they infect. In addition to the natural increase in emerging zoonotic infectious diseases worldwide due to changing environmental conditions and globalization, the use of infectious diseases as warfare agents is a threat in today’s world. Early detection remains one of the best ways to prevent small outbreaks from becoming epidemics and pandemics. We find that an accurate diagnosis, detection, and reporting of diseases are important components of mitigating outbreaks, and biosurveillance remains the top tool in our toolbox. And while vaccines have been important for controlling more common infectious virus diseases, they are less feasible for less common diseases, emerging pathogens, and rapidly evolving microbes. Furthermore, due to globalization and increased travel, emigration, and migration, biosurveillance is critical throughout the world, not just in pockets of more developed regions.

  19. Random measurement error: Why worry? An example of cardiovascular risk factors.

    PubMed

    Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H

    2018-01-01

    With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.
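    The direction-of-bias point above, that classical measurement error can inflate as well as attenuate an exposure-outcome estimate, is easy to reproduce in a small simulation. The sketch below uses illustrative coefficients and variances chosen for this example (not values from the study): the exposure has a true effect of 1.0 and is correlated with a confounder, and ordinary least squares is fit with error added to either variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Synthetic data: confounder c, exposure x correlated with c, outcome y.
# True exposure effect is 1.0; all values are illustrative only.
c = rng.normal(size=n)
x = 0.8 * c + rng.normal(size=n)
y = 1.0 * x + 1.0 * c + rng.normal(size=n)

def exposure_coef(x_obs, c_obs, y):
    """OLS of y on [1, x_obs, c_obs]; returns the exposure coefficient."""
    X = np.column_stack([np.ones_like(x_obs), x_obs, c_obs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

b_clean = exposure_coef(x, c, y)                         # ~1.0: unbiased
b_noisy_x = exposure_coef(x + rng.normal(size=n), c, y)  # attenuated (< 1.0)
b_noisy_c = exposure_coef(x, c + rng.normal(size=n), y)  # inflated (> 1.0)

print(f"no error:            {b_clean:.2f}")
print(f"error in exposure:   {b_noisy_x:.2f}")
print(f"error in confounder: {b_noisy_c:.2f}")
```

    With these settings, error in the exposure attenuates the estimate toward about 0.5, while error in the confounder leaves residual confounding that pushes the estimate above 1.3, illustrating that the direction of bias depends on where the error sits.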

  20. Editorial: Biological Engagement Programs: Reducing Threats and Strengthening Global Health Security Through Scientific Collaboration

    DOE PAGES

    Fair, Jeanne M.

    2017-07-12

    It is often said about infectious diseases that a “threat anywhere is a threat everywhere,” and the recent outbreaks of Ebola in West Africa and Zika virus in South America have proven that pathogens know no borders. Not only are they transboundary, pathogens do not discriminate who they infect. In addition to the natural increase in emerging zoonotic infectious diseases worldwide due to changing environmental conditions and globalization, the use of infectious diseases as warfare agents is a threat in today’s world. Early detection remains one of the best ways to prevent small outbreaks from becoming epidemics and pandemics. We find that an accurate diagnosis, detection, and reporting of diseases are important components of mitigating outbreaks, and biosurveillance remains the top tool in our toolbox. And while vaccines have been important for controlling more common infectious virus diseases, they are less feasible for less common diseases, emerging pathogens, and rapidly evolving microbes. Furthermore, due to globalization and increased travel, emigration, and migration, biosurveillance is critical throughout the world, not just in pockets of more developed regions.

  1. Creative Commons and Why It Should Be More Commonly Understood

    ERIC Educational Resources Information Center

    Johnson, Doug

    2009-01-01

    Authors, videographers, musicians, photographers, and almost anyone who creates materials and makes them publicly available has an alternative to standard copyright licensing: Creative Commons (CC). It is a tool that helps the creator display a licensing mark. The creator can assign a variety of rights for others to use his work--rights that are…

  2. Computational Systems Biology Approach Predicts Regulators and Targets of microRNAs and Their Genomic Hotspots in Apoptosis Process.

    PubMed

    Alanazi, Ibrahim O; Ebrahimie, Esmaeil

    2016-07-01

    Novel computational systems biology tools such as common targets analysis, common regulators analysis, pathway discovery, and transcriptomic-based hotspot discovery provide new opportunities for understanding the molecular mechanisms of apoptosis. In this study, after measuring the global contribution of microRNAs in the course of apoptosis on an Affymetrix platform, systems biology tools were utilized to obtain a comprehensive view on the role of microRNAs in the apoptosis process. Network analysis and pathway discovery highlighted the crosstalk between transcription factors and microRNAs in apoptosis. Within the transcription factors, PRDM1 showed the highest upregulation during the course of apoptosis, with a more than 9-fold expression increase compared to the non-apoptotic condition. Within the microRNAs, MIR1208 showed the highest expression in the non-apoptotic condition and was downregulated more than 6-fold during apoptosis. The common regulators algorithm showed that TNF receptor is the key upstream regulator with a high number of regulatory interactions with the differentially expressed microRNAs. BCL2 and AKT1 were the key downstream targets of differentially expressed microRNAs. Enrichment analysis of the genomic locations of differentially expressed microRNAs led us to the discovery of chromosome bands which were highly enriched (p < 0.01) in apoptosis-related microRNAs, such as 13q31.3, 19p13.13, and Xq27.3. This study opens a new avenue in understanding regulatory mechanisms and downstream functions in the course of apoptosis as well as distinguishing genomic-enriched hotspots for the apoptosis process.
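    The chromosome-band "hotspot" result above is an over-representation test at heart. A minimal, standard-library-only sketch of such a test follows; the counts are hypothetical (the abstract does not report the underlying miRNA totals), and the one-sided hypergeometric tail is one common way to compute this kind of enrichment p-value.

```python
from math import comb

def band_enrichment_p(total, on_band, de_total, de_on_band):
    """One-sided hypergeometric tail P(X >= de_on_band): the probability
    of seeing at least this many differentially expressed miRNAs on one
    chromosome band if DE status were independent of genomic location."""
    tail = sum(
        comb(on_band, k) * comb(total - on_band, de_total - k)
        for k in range(de_on_band, min(on_band, de_total) + 1)
    )
    return tail / comb(total, de_total)

# Hypothetical counts for illustration: 1000 annotated miRNAs, 12 of them
# on the band, 80 differentially expressed overall, 5 of those on the band.
p = band_enrichment_p(1000, 12, 80, 5)
print(f"enrichment p = {p:.2e}")
```

    In practice such p-values would be corrected for testing many bands at once (e.g. Bonferroni or a false discovery rate procedure), since a genome-wide scan examines hundreds of cytogenetic bands.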

  3. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  4. RNA therapeutics targeting osteoclast-mediated excessive bone resorption

    PubMed Central

    Wang, Yuwei; Grainger, David W

    2011-01-01

    RNA interference (RNAi) is a sequence-specific post-transcriptional gene silencing technique developed with dramatically increasing utility for both scientific and therapeutic purposes. Short interfering RNA (siRNA) is currently exploited to regulate protein expression relevant to many therapeutic applications, and commonly used as a tool for elucidating disease-associated genes. Osteoporosis and its associated fragility fractures in both men and women are rapidly becoming a global healthcare crisis as average life expectancy increases worldwide. New therapeutics are needed for this increasing patient population. This review describes the diversity of molecular targets suitable for RNAi-based gene knock-down in osteoclasts to control osteoclast-mediated excessive bone resorption. We identify strategies for developing targeted siRNA delivery and efficient gene silencing, and describe opportunities and challenges of introducing siRNA as a therapeutic approach to hard and connective tissue disorders. PMID:21945356

  5. Effects on Text Simplification: Evaluation of Splitting up Noun Phrases

    PubMed Central

    Leroy, Gondy; Kauchak, David; Hogue, Alan

    2016-01-01

    To help increase health literacy, we are developing a text simplification tool that creates more accessible patient education materials. Tool development is guided by data-driven feature analysis comparing simple and difficult text. In the present study, we focus on the common advice to split long noun phrases. Our previous corpus analysis showed that easier texts contained shorter noun phrases. Subsequently, we conduct a user study to measure the difficulty of sentences containing noun phrases of different lengths (2-gram, 3-gram and 4-gram), conditions (split or not) and, to simulate unknown terms, use of pseudowords (present or not). We gathered 35 evaluations for 30 sentences in each condition (3×2×2 conditions) on Amazon’s Mechanical Turk (N=12,600). We conducted a three-way ANOVA for perceived and actual difficulty. Splitting noun phrases had a positive effect on perceived difficulty but a negative effect on actual difficulty. The presence of pseudowords increased perceived and actual difficulty. Without pseudowords, longer noun phrases led to increased perceived and actual difficulty. A follow-up study using the phrases (N = 1,350) showed that measuring awkwardness may indicate when to split noun phrases. We conclude that splitting noun phrases benefits perceived difficulty, but hurts actual difficulty when the phrasing becomes less natural. PMID:27043754

  6. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  7. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...

  8. RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.

    PubMed

    Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K

    2014-10-01

    RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.

  9. AHCODA-DB: a data repository with web-based mining tools for the analysis of automated high-content mouse phenomics data.

    PubMed

    Koopmans, Bastijn; Smit, August B; Verhage, Matthijs; Loos, Maarten

    2017-04-04

    Systematic, standardized and in-depth phenotyping and data analyses of rodent behaviour empower gene-function studies, drug testing and therapy design. However, no data repositories are currently available for standardized quality control, data analysis and mining at the resolution of individual mice. Here, we present AHCODA-DB, a public data repository with standardized quality control and exclusion criteria aimed to enhance robustness of data, enabled with web-based mining tools for the analysis of individually and group-wise collected mouse phenotypic data. AHCODA-DB allows monitoring in vivo effects of compounds collected from conventional behavioural tests and from automated home-cage experiments assessing spontaneous behaviour, anxiety and cognition without human interference. AHCODA-DB includes such data from mutant mice (transgenics, knock-out, knock-in), (recombinant) inbred strains, and compound effects in wildtype mice and disease models. AHCODA-DB provides real-time statistical analyses with single-mouse resolution and a versatile suite of data presentation tools. On March 9th, 2017, AHCODA-DB contained 650k data points on 2419 parameters from 1563 mice. AHCODA-DB provides users with tools to systematically explore mouse behavioural data, both with positive and negative outcome, published and unpublished, across time and experiments with single-mouse resolution. The standardized (automated) experimental settings and the large current dataset (1563 mice) in AHCODA-DB provide a unique framework for the interpretation of behavioural data and drug effects. The use of common ontologies allows data export to other databases such as the Mouse Phenome Database. Unbiased presentation of positive and negative data obtained under the highly standardized screening conditions increases cost efficiency of publicly funded mouse screening projects and helps to reach consensus conclusions on drug responses and mouse behavioural phenotypes.
The website is publicly accessible through https://public.sylics.com and can be viewed in every recent version of all commonly used browsers.

  10. Middle-aged women's decisions about body weight management: needs assessment and testing of a knowledge translation tool.

    PubMed

    Stacey, Dawn; Jull, Janet; Beach, Sarah; Dumas, Alex; Strychar, Irene; Adamo, Kristi; Brochu, Martin; Prud'homme, Denis

    2015-04-01

    This study aims to assess middle-aged women's needs when making body weight management decisions and to evaluate a knowledge translation tool for addressing their needs. A mixed-methods study used an interview-guided theory-based survey of professional women aged 40 to 65 years. The tool summarized evidence to address their needs and enabled women to monitor actions taken. Acceptability and usability were reported descriptively. Sixty female participants had a mean body mass index of 28.0 kg/m(2) (range, 17.0-44.9 kg/m(2)), and half were premenopausal. Common options for losing (82%) or maintaining (18%) weight included increasing physical activity (60%), eating healthier (57%), and getting support (40%). Decision-making involved getting information on options (52%), soliciting others' decisions/advice (20%), and being self-motivated (20%). Preferred information sources included written information (97%), counseling (90%), and social networking websites (43%). Five professionals (dietitian, personal trainer, occupational therapist, and two physicians) had similar responses. Of 53 women sent the tool, 27 provided acceptability feedback. They rated it as good to excellent for information on menopause (96%), body weight changes (85%), and managing body weight (85%). Most would tell others about it (81%). After 4 weeks of use, 25 women reported that the wording made sense (96%) and that the tool had clear instructions (92%) and was easy to use across time (88%). The amount of information was rated as just right (64%), but the tool had limited space for responding (72%). When making decisions about body weight management, women's needs were "getting information" and "getting support." The knowledge translation tool was acceptable and usable, but further evaluation is required.

  11. Middle-aged women’s decisions about body weight management: needs assessment and testing of a knowledge translation tool

    PubMed Central

    Stacey, Dawn; Jull, Janet; Beach, Sarah; Dumas, Alex; Strychar, Irene; Adamo, Kristi; Brochu, Martin; Prud’homme, Denis

    2015-01-01

    Abstract Objective This study aims to assess middle-aged women’s needs when making body weight management decisions and to evaluate a knowledge translation tool for addressing their needs. Methods A mixed-methods study used an interview-guided theory-based survey of professional women aged 40 to 65 years. The tool summarized evidence to address their needs and enabled women to monitor actions taken. Acceptability and usability were reported descriptively. Results Sixty female participants had a mean body mass index of 28.0 kg/m2 (range, 17.0-44.9 kg/m2), and half were premenopausal. Common options for losing (82%) or maintaining (18%) weight included increasing physical activity (60%), eating healthier (57%), and getting support (40%). Decision-making involved getting information on options (52%), soliciting others’ decisions/advice (20%), and being self-motivated (20%). Preferred information sources included written information (97%), counseling (90%), and social networking websites (43%). Five professionals (dietitian, personal trainer, occupational therapist, and two physicians) had similar responses. Of 53 women sent the tool, 27 provided acceptability feedback. They rated it as good to excellent for information on menopause (96%), body weight changes (85%), and managing body weight (85%). Most would tell others about it (81%). After 4 weeks of use, 25 women reported that the wording made sense (96%) and that the tool had clear instructions (92%) and was easy to use across time (88%). The amount of information was rated as just right (64%), but the tool had limited space for responding (72%). Conclusions When making decisions about body weight management, women’s needs were “getting information” and “getting support.” The knowledge translation tool was acceptable and usable, but further evaluation is required. PMID:25816120

  12. Common Effects Methodology National Stakeholder Meeting December 1, 2010 White Papers

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  13. Common Effects Methodology Regional Stakeholder Meeting January 11 -22, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  14. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  15. Automated Diabetic Retinopathy Screening and Monitoring Using Retinal Fundus Image Analysis.

    PubMed

    Bhaskaranand, Malavika; Ramachandra, Chaithanya; Bhat, Sandeep; Cuadros, Jorge; Nittala, Muneeswar Gupta; Sadda, SriniVas; Solanki, Kaushal

    2016-02-16

    Diabetic retinopathy (DR)-a common complication of diabetes-is the leading cause of vision loss among the working-age population in the western world. DR is largely asymptomatic, but if detected at early stages the progression to vision loss can be significantly slowed. With the increasing diabetic population there is an urgent need for automated DR screening and monitoring. To address this growing need, in this article we discuss an automated DR screening tool and extend it for automated estimation of microaneurysm (MA) turnover, a potential biomarker for DR risk. The DR screening tool automatically analyzes color retinal fundus images from a patient encounter for the various DR pathologies and collates the information from all the images belonging to a patient encounter to generate a patient-level screening recommendation. The MA turnover estimation tool aligns retinal images from multiple encounters of a patient, localizes MAs, and performs MA dynamics analysis to evaluate new, persistent, and disappeared lesion maps and estimate MA turnover rates. The DR screening tool achieves 90% sensitivity at 63.2% specificity on a data set of 40 542 images from 5084 patient encounters obtained from the EyePACS telescreening system. On a subset of 7 longitudinal pairs the MA turnover estimation tool identifies new and disappeared MAs with 100% sensitivity and average false positives of 0.43 and 1.6 respectively. The presented automated tools have the potential to address the growing need for DR screening and monitoring, thereby saving vision of millions of diabetic patients worldwide. © 2016 Diabetes Technology Society.

  16. Digital Tools to Enhance Clinical Reasoning.

    PubMed

    Manesh, Reza; Dhaliwal, Gurpreet

    2018-05-01

    Physicians can improve their diagnostic acumen by adopting a simulation-based approach to analyzing published cases. The tight coupling of clinical problems and their solutions affords physicians the opportunity to efficiently upgrade their illness scripts (structured knowledge of a specific disease) and schemas (structured frameworks for common problems). The more times clinicians practice accessing and applying those knowledge structures through published cases, the greater the odds that they will have an enhanced approach to similar patient-cases in the future. This article highlights digital resources that increase the number of cases a clinician experiences and learns from. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Advanced tools for the analysis of protein phosphorylation in yeast mitochondria.

    PubMed

    Walter, Corvin; Gonczarowska-Jorge, Humberto; Sickmann, Albert; Zahedi, René P; Meisinger, Chris; Schmidt, Oliver

    2018-05-24

    The biochemical analysis of protein phosphorylation in mitochondria lags behind that of cytosolic signaling events. One reason is the poor stability of many phosphorylation sites during common isolation procedures for mitochondria. We present here an optimized, fast protocol for the purification of yeast mitochondria that greatly increases recovery of phosphorylated mitochondrial proteins. Moreover, we describe improved protocols for the biochemical analysis of mitochondrial protein phosphorylation by Zn2+-Phos-tag electrophoresis under both denaturing and, for the first time, native conditions, and demonstrate that they outperform previously applied methods. Copyright © 2018. Published by Elsevier Inc.

  18. Polymers and biopolymers at interfaces

    NASA Astrophysics Data System (ADS)

    Hall, A. R.; Geoghegan, M.

    2018-03-01

    This review updates recent progress in the understanding of the behaviour of polymers at surfaces and interfaces, highlighting examples in the areas of wetting, dewetting, crystallization, and ‘smart’ materials. Recent developments in analysis tools have yielded a large increase in the study of biological systems, and some of these will also be discussed, focussing on areas where surfaces are important. These areas include molecular binding events and protein adsorption as well as the mapping of the surfaces of cells. Important techniques commonly used for the analysis of surfaces and interfaces are discussed separately to aid the understanding of their application.

  19. Extending the boundaries of reverse engineering

    NASA Astrophysics Data System (ADS)

    Lawrie, Chris

    2002-04-01

    In today's marketplace the potential of Reverse Engineering as a time compression tool is commonly lost under its traditional definition. The term Reverse Engineering was coined at the advent of CMM machines and 3D CAD systems to describe the process of fitting surfaces to captured point data. Since these early beginnings, downstream hardware scanning and digitising systems have evolved in parallel with an upstream demand, greatly increasing the potential of a point cloud data set within engineering design and manufacturing processes. The paper will discuss the issues surrounding Reverse Engineering at the turn of the millennium.

  20. HLH Rotor Blade Manufacturing Technology Development Report

    DTIC Science & Technology

    1977-09-01

    Tool Design and Fabrication; Tool Concepts and Materials; Autoclave Cure - Plastic Molds. ...The Double Coke Bottle specimen (Figure 13) was laid up on a bean bag and cured in a fiberglass tool in...lower airfoil) was made from a foam material, mounted on a common base, and covered with a plastic coating to give a hard working surface. This is

  1. Prostate Upgrading Team Project — EDRN Public Portal

    Cancer.gov

    Aim 1: We will develop a risk assessment tool using commonly collected clinical information from a series of contemporary radical prostatectomies to predict the risk of prostate cancer upgrading to high-grade cancer at radical prostatectomy. These data will be combined as a part of our Early Detection Research Network (EDRN) GU Working Group into a risk assessment tool; this tool will be named the EDRN Prostatectomy Upgrading Calculator (EPUC).

  2. A Comparative Case Study Analysis of Administrators Perceptions on the Adaptation of Quality and Continuous Improvement Tools to Community Colleges in the State of Michigan

    ERIC Educational Resources Information Center

    Mattis, Ted B.

    2011-01-01

    The purpose of this study was to determine whether community college administrators in the state of Michigan believe that commonly known quality and continuous improvement tools, prevalent in a manufacturing environment, can be adapted to a community college model. The tools, specifically Six Sigma, benchmarking and process mapping have played a…

  3. Lymph Node Metastases Optical Molecular Diagnostic and Radiation Therapy

    DTIC Science & Technology

    2017-03-01

    structures and not molecular functions. The one tool commonly used for metastases imaging is nuclear medicine. Positron emission tomography (PET) is...be visualized at a relevant stage, largely because most imaging is based upon structures and not molecular functions. But there are no tools to...system suitable for imaging signals from small animals on standard radiation therapy tools. (3) To evaluate the limits on structural, metabolic

  4. Image Processing Using a Parallel Architecture.

    DTIC Science & Technology

    1987-12-01

    ENG/87D-25 Abstract: This study developed a set of low-level image processing tools on a parallel computer that allows concurrent processing of images...environment, the set of tools offers a significant reduction in the time required to perform some commonly used image processing operations. ...step toward developing these systems, a structured set of image processing tools was implemented using a parallel computer. More important than

  5. Management Tools for Bus Maintenance: Current Practices and New Methods. Final Report.

    ERIC Educational Resources Information Center

    Foerster, James; And Others

    Management of bus fleet maintenance requires systematic recordkeeping, management reporting, and work scheduling procedures. Tools for controlling and monitoring routine maintenance activities are in common use. These include defect and fluid consumption reports, work order systems, historical maintenance records, and performance and cost…

  6. Gestural Imitation and Limb Apraxia in Corticobasal Degeneration

    ERIC Educational Resources Information Center

    Salter, Jennifer E.; Roy, Eric A.; Black, Sandra E.; Joshi, Anish; Almeida, Quincy

    2004-01-01

    Limb apraxia is a common symptom of corticobasal degeneration (CBD). While previous research has shown that individuals with CBD have difficulty imitating transitive (tool-use actions) and intransitive non-representational gestures (nonsense actions), intransitive representational gestures (actions without a tool) have not been examined. In the…

  7. The use of grounded theory in studies of nurses and midwives' coping processes: a systematic literature search.

    PubMed

    Cheer, Karen; MacLaren, David; Tsey, Komla

    2015-01-01

    Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. The aim of this review was to examine common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified 16 studies, whose grounded theory characteristics were assessed; study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology, or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve the research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.

  8. The role of radiology in diagnosis and management of drug mules: an update with new challenges and new diagnostic tools

    PubMed Central

    Cengel, Ferhat

    2016-01-01

    Emergency physicians and radiologists have been increasingly encountering internal concealment of illegal drugs. The packages commonly contain powdered solid drugs such as cocaine, heroin, methamphetamine and hashish, but they may also contain cocaine in liquid form. The second type of package has recently been more commonly encountered, and poses a greater diagnostic challenge. As clinical evaluation and laboratory tests frequently fail to make the correct diagnosis, imaging examination is typically required. Imaging methods assume a vital role in diagnosis, follow-up and management. Abdominal X-ray, ultrasonography, CT and MRI are used for imaging. Among these methods, low-dose CT is state-of-the-art in these cases. It is of paramount importance that radiologists have full knowledge of the imaging characteristics of these packages and accurately guide physicians and security officials. PMID:26867003

  9. Zeroing in on Number and Operations, Grades 7-8: Key Ideas and Common Misconceptions

    ERIC Educational Resources Information Center

    Collins, Anne; Dacey, Linda

    2010-01-01

    "The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained in decades of mathematics teaching and research,…

  10. Zeroing in on Number and Operations, Grades 3-4: Key Ideas and Common Misconceptions

    ERIC Educational Resources Information Center

    Dacey, Linda; Collins, Anne

    2010-01-01

    "The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained in decades of mathematics teaching and research,…

  11. Zeroing in on Number and Operations, Grades 5-6: Key Ideas and Common Misconceptions

    ERIC Educational Resources Information Center

    Collins, Anne; Dacey, Linda

    2010-01-01

    "The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained through decades of mathematics teaching and research,…

  12. Airplane numerical simulation for the rapid prototyping process

    NASA Astrophysics Data System (ADS)

    Roysdon, Paul F.

    Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into up-to-date methods for airplane development and design. Modern engineering software tools, such as MatLab and Excel, are presented with examples of batch and optimization algorithms that combine the computing power of MatLab with robust aerodynamic tools such as XFOIL and AVL. The resulting data are demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. Applications for this numerical toolbox range from unmanned aerial vehicles to first-order analysis of manned aircraft. A blended-wing-body airplane is used for the analysis to demonstrate the flexibility of the code, from classic wing-and-tail configurations to less common ones such as the blended-wing-body. This configuration has been shown to have superior aerodynamic performance compared with classic wing-and-tube fuselage counterparts, reduced sensitivity to aerodynamic flutter, and potential for increased engine noise abatement. However, without a classic tail elevator to damp the nose-up pitching moment, or a vertical tail rudder to damp yaw and roll, the challenges in lateral roll and yaw stability, as well as in pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of the toolbox through examples and comparison of the results with similar airplane performance characteristics published in the literature.

  13. Ultrasonic grinding of optical materials

    NASA Astrophysics Data System (ADS)

    Cahill, Michael; Bechtold, Michael; Fess, Edward; Stephan, Thomas; Bechtold, Rob

    2017-10-01

    Hard ceramic optical materials such as sapphire, ALON, Spinel, PCA, or Silicon Carbide can present a significant challenge in manufacturing precision optical components due to their tough mechanical properties. These are also the same mechanical properties that make them desirable materials when used in harsh environments. Slow processing speeds, premature tool wear, and poor surface quality are common results of the tough mechanical properties of these materials. Often, as a preparatory stage for polishing, the finish of the ground surface greatly influences the polishing process and the resulting finished product. To overcome these challenges, OptiPro Systems has developed an ultrasonic assisted grinding technology, OptiSonic, which has been designed for the precision optics and ceramics industry. OptiSonic utilizes a custom tool holder designed to produce oscillations, in microns of amplitude, in line with the rotating spindle. A software package, IntelliSonic, is integral to the function of this platform. IntelliSonic can automatically characterize tooling during setup to identify and select the ideal resonant peak at which to operate. Then, while grinding, IntelliSonic continuously adjusts the output frequency for optimal grinding efficiency while in contact with the part. This helps maintain a highly consistent process under changing load conditions for a more precise surface. Utilizing a variety of instruments, tests have shown a reduction in force between tool and part of up to 50%, together with improved surface quality and reduced tool wear. This paper will present the challenges associated with these materials and the solutions created to overcome them.

  14. Analysis Of The Surface Roughness Obtained During The Dry Turning Of UNS A97050-T7 Aluminium Alloys

    NASA Astrophysics Data System (ADS)

    de Agustina, B.; Rubio, E. M.; Villeta, M.; Sebastián, M. A.

    2009-11-01

    Currently, in the aeronautical, aerospace and automotive industries there is high demand for materials, such as aluminium alloys, that combine high strength, even at high temperatures, with low density. For this reason, these alloys are widely used to produce elements of aircraft and aerospace vehicles. Nevertheless, despite their competitive advantages, these materials commonly show machinability problems associated with tool wear. Traditionally, cutting fluids have therefore been used in machining processes; however, they can contain environmentally harmful constituents and considerably increase the total cost of the process. Research has consequently focused on cleaner production technologies such as dry machining. This leads to the search for combinations of cutting parameters and tool types (coatings and geometries) that could improve machining under such conditions. The aim of this study is to analyse the relationship between the surface roughness obtained during the dry turning of aluminium UNS A97050-T7 bars and the cutting parameters (cutting speed and feed) using three different tools. As a first conclusion, the feed was the cutting parameter most influential on surface roughness, followed to a lesser extent by the cutting speed, the type of tool, and the interaction between the type of tool and the feed.

  15. Designing Health Information Technology Tools to Prevent Gaps in Public Health Insurance.

    PubMed

    Hall, Jennifer D; Harding, Rose L; DeVoe, Jennifer E; Gold, Rachel; Angier, Heather; Sumic, Aleksandra; Nelson, Christine A; Likumahuwa-Ackman, Sonja; Cohen, Deborah J

    2017-06-23

    Changes in health insurance policies have increased coverage opportunities, but enrollees are required to annually reapply for benefits, which, if not managed appropriately, can lead to insurance gaps. Electronic health records (EHRs) can automate processes for assisting patients with health insurance enrollment and re-enrollment. We describe community health centers' (CHC) workflow, documentation, and tracking needs for assisting families with insurance application processes, and the health information technology (IT) tool components that were developed to meet those needs. We conducted a qualitative study using semi-structured interviews and observation of clinic operations and insurance application assistance processes. Data were analyzed using a grounded theory approach. We diagramed workflows and shared information with a team of developers who built the EHR-based tools. Four steps to the insurance assistance workflow were common among CHCs: 1) Identifying patients for public health insurance application assistance; 2) Completing and submitting the public health insurance application when clinic staff met with patients to collect requisite information and helped them apply for benefits; 3) Tracking public health insurance approval to monitor for decisions; and 4) Assisting with annual health insurance reapplication. We developed EHR-based tools to support clinical staff with each of these steps. CHCs are uniquely positioned to help patients and families with public health insurance applications. CHCs have invested in staff to assist patients with insurance applications and help prevent coverage gaps. To best assist patients and to foster efficiency, EHR-based insurance tools need comprehensive, timely, and accurate health insurance information.

  16. Novel 1H low field nuclear magnetic resonance applications for the field of biodiesel

    PubMed Central

    2013-01-01

    Background Biodiesel production has increased dramatically over the last decade, raising the need for new rapid and non-destructive analytical tools and technologies. 1H Low Field Nuclear Magnetic Resonance (LF-NMR) applications, which offer great potential to the field of biodiesel, have been developed by the Phyto Lipid Biotechnology Lab research team in the last few years. Results Supervised and un-supervised chemometric tools are suggested for screening new alternative biodiesel feedstocks according to oil content and viscosity. The tools allowed assignment into viscosity groups of biodiesel-petrodiesel samples whose viscosity is unknown, and uncovered biodiesel samples that have residues of unreacted acylglycerol and/or methanol, and poorly separated and cleaned glycerol and water. In the case of composite materials, relaxation time distribution, and cross-correlation methods were successfully applied to differentiate components. Continuous distributed methods were also applied to calculate the yield of the transesterification reaction, and thus monitor the progress of the common and in-situ transesterification reactions, offering a tool for optimization of reaction parameters. Conclusions Comprehensive applied tools are detailed for the characterization of new alternative biodiesel resources in their whole conformation, monitoring of the biodiesel transesterification reaction, and quality evaluation of the final product, using a non-invasive and non-destructive technology that is new to the biodiesel research area. A new integrated computational-experimental approach for analysis of 1H LF-NMR relaxometry data is also presented, suggesting improved solution stability and peak resolution. PMID:23590829
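
    The relaxometry methods described above build on the basic relaxation decay model. As a rough illustration only (not the authors' chemometric pipeline), a single T2 relaxation time can be recovered from a mono-exponential decay S(t) = S0·exp(-t/T2) with a log-linear least-squares fit; the sample values below are synthetic:

```python
import numpy as np

def estimate_t2(times, signal):
    """Estimate a single T2 relaxation time (same units as `times`)
    from a mono-exponential decay S(t) = S0 * exp(-t / T2),
    via a degree-1 least-squares fit to ln(S)."""
    slope, _intercept = np.polyfit(times, np.log(signal), 1)
    return -1.0 / slope

# Synthetic decay: S0 = 100, T2 = 50 ms, sampled every 2 ms.
t = np.arange(1.0, 101.0, 2.0)          # ms
s = 100.0 * np.exp(-t / 50.0)
print(round(estimate_t2(t, s), 1))      # → 50.0
```

    Real LF-NMR data contain multiple relaxation components and noise, which is why the authors use distributed (multi-exponential) methods rather than a single-component fit like this one.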

  17. Screening for substance abuse risk in cancer patients using the Opioid Risk Tool and urine drug screen.

    PubMed

    Barclay, Joshua S; Owens, Justine E; Blackhall, Leslie J

    2014-07-01

    The use of opioids for management of cancer-related pain has increased significantly and has been associated with a substantial rise in rates of substance abuse and diversion. There is a paucity of data not only on the prevalence of substance abuse in cancer patients, but also for issues of drug use and diversion in family caregivers. This study aimed to evaluate the frequency of risk factors for substance abuse and diversion, and abnormal urine drug screens, in cancer patients receiving palliative care. A retrospective chart review was performed for patients with cancer who were seen in the University of Virginia Palliative Care Clinic during the month of September 2012. We evaluated Opioid Risk Tool variables and total scores, insurance status, and urine drug screen results. Of the 114 cancer patients seen in September 2012, the mean Opioid Risk Tool score was 3.79, with 43% of patients defined as medium to high risk. Age (16-45 years old, 23%) and a personal history of alcohol (23%) or illicit drugs (21%) were the most common risk factors identified. We obtained a urine drug screen on 40% of patients, noting abnormal findings in 45.65%. Opioids are an effective treatment for cancer-related pain, yet substantial risk for substance abuse exists in the cancer population. Screening tools, such as the Opioid Risk Tool, should be used as part of a complete patient assessment to balance risk with appropriate relief of suffering.

  18. A Standardized Approach to Topographic Data Processing and Workflow Management

    NASA Astrophysics Data System (ADS)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

    An ever-increasing list of options exists for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and the sequence in which to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing.
It also represents a forum for discovering and sharing effective topographic processing workflows.
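
    The core idea of the portal - chaining tools through a common protocol while archiving every input parameter for replay - can be sketched in a few lines. This is an illustrative toy, not the ZCloudTools implementation; the step functions stand in for real point-cloud tools:

```python
import json

def run_workflow(steps, data):
    """Run a sequence of (name, func, params) steps over `data`,
    recording each step's parameters so the whole analysis can be
    archived and replayed later."""
    log = []
    for name, func, params in steps:
        data = func(data, **params)
        log.append({"step": name, "params": params})
    return data, json.dumps(log)

# Hypothetical stand-ins for decimation and filtering tools.
def decimate(points, keep_every):
    return points[::keep_every]

def threshold(points, z_max):
    return [p for p in points if p <= z_max]

steps = [("decimate", decimate, {"keep_every": 2}),
         ("threshold", threshold, {"z_max": 5.0})]
result, archive = run_workflow(steps, [1.0, 9.0, 3.0, 9.0, 5.0, 9.0])
print(result)   # → [1.0, 3.0, 5.0]
```

    The JSON `archive` string is the key point: shipping it alongside the outputs makes the processing sequence reproducible by anyone with the same tools.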

  19. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    NASA Astrophysics Data System (ADS)

    Mayorga, E.

    2013-12-01

    Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. 
Collaborative development of such tools can provide the additional advantage of enhancing cohesion and communication across specific research areas, and reducing research obstacles in a range of disciplines.

  20. An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.

    PubMed

    Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida

    2017-05-31

    Recently copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Whole exome sequencing (WES) has become the primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, tumors' complexity makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection from WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on tools that have been designed for, or have the ability to detect, cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real and simulated data. Comparative analysis of the results showed that there is low consensus among the tools in calling CNVs. Using real data, the tools show moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Using simulated data, we also observed that increasing the coverage beyond 10× in exonic regions does not significantly improve the detection power of the tools. The limited performance of current CNV detection tools for WES data in cancer indicates the need for more efficient and precise CNV detection methods. 
Due to the complexity of tumors and high level of noise and biases in WES data, employing advanced novel segmentation, normalization and de-noising techniques that are designed specifically for cancer data is necessary. Also, CNV detection development suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly user interfaces and visualization features can enhance CNV studies for a broader range of users.
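
    The two headline metrics in this evaluation, sensitivity and FDR, reduce to simple set arithmetic over true and called variants. A minimal sketch (the region labels are illustrative, and real benchmarks match calls by genomic overlap rather than exact labels):

```python
def evaluate_calls(true_cnvs, called_cnvs):
    """Sensitivity = TP / (all true CNVs); FDR = FP / (all calls).
    Regions are treated as simple hashable labels for illustration."""
    true_set, called_set = set(true_cnvs), set(called_cnvs)
    tp = len(true_set & called_set)
    sensitivity = tp / len(true_set)
    fdr = (len(called_set) - tp) / len(called_set)
    return sensitivity, fdr

truth = {"chr1:1000-5000", "chr2:200-900", "chr7:10-80", "chr9:5-50"}
calls = {"chr1:1000-5000", "chr2:200-900", "chr7:10-80",
         "chrX:1-9", "chrY:1-9"}                # two false positives
sens, fdr = evaluate_calls(truth, calls)
print(sens, round(fdr, 2))   # → 0.75 0.4
```

    Note that a caller can trade one metric against the other: emitting more calls raises sensitivity but inflates FDR, which is why the paper reports both.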

  1. Developing Cancer Informatics Applications and Tools Using the NCI Genomic Data Commons API.

    PubMed

    Wilson, Shane; Fitzsimons, Michael; Ferguson, Martin; Heath, Allison; Jensen, Mark; Miller, Josh; Murphy, Mark W; Porter, James; Sahni, Himanso; Staudt, Louis; Tang, Yajing; Wang, Zhining; Yu, Christine; Zhang, Junjun; Ferretti, Vincent; Grossman, Robert L

    2017-11-01

    The NCI Genomic Data Commons (GDC) was launched in 2016 and makes available over 4 petabytes (PB) of cancer genomic and associated clinical data to the research community. This dataset continues to grow and currently includes over 14,500 patients. The GDC is an example of a biomedical data commons, which collocates biomedical data with storage and computing infrastructure and commonly used web services, software applications, and tools to create a secure, interoperable, and extensible resource for researchers. The GDC is (i) a data repository for downloading data that have been submitted to it, and also a system that (ii) applies a common set of bioinformatics pipelines to submitted data; (iii) reanalyzes existing data when new pipelines are developed; and (iv) allows users to build their own applications and systems that interoperate with the GDC using the GDC Application Programming Interface (API). We describe the GDC API and how it has been used both by the GDC itself and by third parties. Cancer Res; 77(21); e15-e18. ©2017 American Association for Cancer Research.
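
    A rough sketch of how a third-party script might query the GDC API described above. The endpoint and the `{"op": "in", ...}` filter grammar follow the public GDC API documentation; the project ID, field list, and size are illustrative choices, and the live HTTP call is shown only as a comment:

```python
import json

GDC_FILES_ENDPOINT = "https://api.gdc.cancer.gov/files"  # public GDC API

def build_filter(field, values):
    """Build a GDC-style 'in' filter clause for one field."""
    return {"op": "in", "content": {"field": field, "value": list(values)}}

params = {
    "filters": json.dumps(build_filter("cases.project.project_id",
                                       ["TCGA-BRCA"])),
    "fields": "file_id,file_name,data_category",
    "format": "JSON",
    "size": "5",
}
# A live request would be roughly:
#   import requests
#   response = requests.get(GDC_FILES_ENDPOINT, params=params)
print(json.loads(params["filters"])["op"])   # → in
```

    Because filters are passed as a JSON string inside the query parameters, building them programmatically (as above) avoids the escaping errors that come from hand-writing the nested JSON.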

  2. Abrasive Wear Resistance of Tool Steels Evaluated by the Pin-on-Disc Testing

    NASA Astrophysics Data System (ADS)

    Bressan, José Divo; Schopf, Roberto Alexandre

    2011-05-01

    The present work examines the abrasion wear resistance of tool steels and the abrasion mechanisms that are a main contributor to tooling failure in the metal forming industry. Tooling used in cutting and metal forming processes without lubrication fails due to this type of wear. In workshop and engineering practice, it is common to relate wear resistance to material hardness only. However, other parameters influence wear, such as fracture toughness, the type of crystalline structure, and the occurrence and nature of hard precipitates in the metallic matrix. In the present investigation, the wear mechanisms acting in tool steels were analyzed and, by normalized tests, the wear resistance of nine different tool steels was evaluated by pin-on-disc testing. Conventional tool steels commonly used in tooling, such as AISI H13 and AISI A2, were compared with tool steels fabricated by a sintering process, such as Crucible CPM 3V, CPM 9V and M4 steels. Friction and wear testing were carried out in automated pin-on-disc equipment in which the pin was tool steel and the counter-face was an abrasive disc of silicon carbide. A normal load of 5 N, a sliding velocity of 0.45 m/s, a total sliding distance of 3000 m and room temperature were employed. The wear rate was calculated by Archard's equation and from plotted graphs of cumulative pin volume loss versus sliding distance. Specimens were appropriately heat treated by quenching and three tempering cycles. The percentage of alloying elements, metallographic analyses of microstructure and Vickers microhardness of the specimens were performed, analyzed and correlated with wear rate. The work concludes with a ranking of tool steel wear rates, comparing abrasion wear resistance: the best wear resistance evaluated was that of the Crucible CPM 9V steel.
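
    Archard's equation, used above to compute wear rate, states that the worn volume is proportional to load and sliding distance and inversely proportional to hardness: V = k·F·s/H. The load and sliding distance below match the test conditions in the abstract; the wear coefficient k and hardness H are illustrative values, not the measured ones:

```python
def archard_volume_loss(k, load_n, distance_m, hardness_pa):
    """Archard's wear equation: V = k * F * s / H
    (V in m^3; k dimensionless; F in N; s in m; H in Pa)."""
    return k * load_n * distance_m / hardness_pa

# F = 5 N and s = 3000 m as in the abstract; k = 1e-4 and
# H = 6 GPa are hypothetical, chosen only to show the arithmetic.
v = archard_volume_loss(k=1e-4, load_n=5.0, distance_m=3000.0,
                        hardness_pa=6e9)
print(v)   # ≈ 2.5e-10 m^3, i.e. about 0.25 mm^3 of material removed
```

    The equation also explains why hardness alone is an incomplete predictor: k itself varies with microstructure, toughness, and precipitates, which is the point the abstract makes.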

  3. A Common Variation in Deiodinase 1 Gene DIO1 Is Associated with the Relative Levels of Free Thyroxine and Triiodothyronine

    PubMed Central

    Panicker, Vijay; Cluett, Christie; Shields, Beverley; Murray, Anna; Parnell, Kirstie S.; Perry, John R. B.; Weedon, Michael N.; Singleton, Andrew; Hernandez, Dena; Evans, Jonathan; Durant, Claire; Ferrucci, Luigi; Melzer, David; Saravanan, Ponnusamy; Visser, Theo J.; Ceresini, Graziano; Hattersley, Andrew T.; Vaidya, Bijay; Dayan, Colin M.; Frayling, Timothy M.

    2008-01-01

    Introduction: Genetic factors influence circulating thyroid hormone levels, but the common gene variants involved have not been conclusively identified. The genes encoding the iodothyronine deiodinases are good candidates because they alter the balance of thyroid hormones. We aimed to thoroughly examine the role of common variation across the three deiodinase genes in relation to thyroid hormones. Methods: We used HapMap data to select single-nucleotide polymorphisms (SNPs) that captured a large proportion of the common genetic variation across the three deiodinase genes. We analyzed these initially in a cohort of 552 people on T4 replacement. Suggestive findings were taken forward into three additional studies in people not on T4 (total n = 2513) and metaanalyzed for confirmation. Results: A SNP in the DIO1 gene, rs2235544, was associated with the free T3 to free T4 ratio with genome-wide levels of significance (P = 3.6 × 10(-13)). The C-allele of this SNP was associated with increased deiodinase 1 (D1) function with resulting increase in free T3/T4 ratio and free T3 and decrease in free T4 and rT3. There was no effect on serum TSH levels. None of the SNPs in the genes coding for D2 or D3 had any influence on hormone levels. Conclusions: This study provides convincing evidence that common genetic variation in DIO1 alters deiodinase function, resulting in an alteration in the balance of circulating free T3 to free T4. This should prove a valuable tool to assess the relative effects of circulating free T3 vs. free T4 on a wide range of biological parameters. PMID:18492748

  4. A common variation in deiodinase 1 gene DIO1 is associated with the relative levels of free thyroxine and triiodothyronine.

    PubMed

    Panicker, Vijay; Cluett, Christie; Shields, Beverley; Murray, Anna; Parnell, Kirstie S; Perry, John R B; Weedon, Michael N; Singleton, Andrew; Hernandez, Dena; Evans, Jonathan; Durant, Claire; Ferrucci, Luigi; Melzer, David; Saravanan, Ponnusamy; Visser, Theo J; Ceresini, Graziano; Hattersley, Andrew T; Vaidya, Bijay; Dayan, Colin M; Frayling, Timothy M

    2008-08-01

    Genetic factors influence circulating thyroid hormone levels, but the common gene variants involved have not been conclusively identified. The genes encoding the iodothyronine deiodinases are good candidates because they alter the balance of thyroid hormones. We aimed to thoroughly examine the role of common variation across the three deiodinase genes in relation to thyroid hormones. We used HapMap data to select single-nucleotide polymorphisms (SNPs) that captured a large proportion of the common genetic variation across the three deiodinase genes. We analyzed these initially in a cohort of 552 people on T(4) replacement. Suggestive findings were taken forward into three additional studies in people not on T(4) (total n = 2513) and metaanalyzed for confirmation. A SNP in the DIO1 gene, rs2235544, was associated with the free T(3) to free T(4) ratio with genome-wide levels of significance (P = 3.6 x 10(-13)). The C-allele of this SNP was associated with increased deiodinase 1 (D1) function with resulting increase in free T(3)/T(4) ratio and free T(3) and decrease in free T(4) and rT(3). There was no effect on serum TSH levels. None of the SNPs in the genes coding for D2 or D3 had any influence on hormone levels. This study provides convincing evidence that common genetic variation in DIO1 alters deiodinase function, resulting in an alteration in the balance of circulating free T(3) to free T(4). This should prove a valuable tool to assess the relative effects of circulating free T(3) vs. free T(4) on a wide range of biological parameters.

  5. A Bayesian approach to the creation of a study-customized neonatal brain atlas

    PubMed Central

    Zhang, Yajing; Chang, Linda; Ceritoglu, Can; Skranes, Jon; Ernst, Thomas; Mori, Susumu; Miller, Michael I.; Oishi, Kenichi

    2014-01-01

    Atlas-based image analysis (ABA), in which an anatomical “parcellation map” is used for parcel-by-parcel image quantification, is widely used to analyze anatomical and functional changes related to brain development, aging, and various diseases. The parcellation maps are often created based on common MRI templates, which allow users to transform the template to target images, or vice versa, to perform parcel-by-parcel statistics, and report the scientific findings based on common anatomical parcels. The use of a study-specific template, which represents the anatomical features of the study population better than common templates, is preferable for accurate anatomical labeling; however, the creation of a parcellation map for a study-specific template is extremely labor intensive, and the definitions of anatomical boundaries are not necessarily compatible with those of the common template. In this study, we employed a Volume-based Template Estimation (VTE) method to create a neonatal brain template customized to a study population, while keeping the anatomical parcellation identical to that of a common MRI atlas. The VTE was used to morph the standardized parcellation map of the JHU-neonate-SS atlas to capture the anatomical features of a study population. The resultant “study-customized” T1-weighted and diffusion tensor imaging (DTI) template, with three-dimensional anatomical parcellation that defined 122 brain regions, was compared with the JHU-neonate-SS atlas, in terms of the registration accuracy. A pronounced increase in the accuracy of cortical parcellation and superior tensor alignment were observed when the customized template was used. With the customized atlas-based analysis, the fractional anisotropy (FA) detected closely approximated the manual measurements. This tool provides a solution for achieving normalization-based measurements with increased accuracy, while reporting scientific findings in a consistent framework. PMID:25026155

  6. Tools for Practical Psychotherapy: A Transtheoretical Collection (or Interventions Which Have, At Least, Worked for Us).

    PubMed

    Yager, Joel; Feinstein, Robert E

    2017-01-01

    Regardless of their historical and theoretical roots, strategies, tactics, and techniques used in everyday psychotherapy across diverse theoretical schools contain common factors and methods from other specific psychotherapeutic modalities that contribute substantially to psychotherapy outcomes. Common factors include alliance, empathy, goal consensus/collaboration, positive regard/affirmation, and congruence/genuineness, among others. All therapies also recognize that factors specific to therapists impact treatment. Starting with these common factors, we add psychotherapeutic methods from many theoretical orientations to create a collection of clinical tools. We then provide concrete suggestions for enacting psychotherapy interventions, which constitute a transtheoretical collection. We begin with observations made by earlier scholars, our combined clinical and teaching experiences, and oral traditions and clinical pearls passed down from our own supervisors and mentors. We have compiled a list of tools for students with foundational knowledge in the basic forms of psychotherapy, which may expand their use of additional interventions for practicing effective psychotherapy. Our toolbox is organized into 4 categories: Relating; Exploring; Explaining; and Intervening. We note how these tools correspond to items previously published in a list of core psychotherapy competencies. In our view, the toolbox can be used most judiciously by students and practitioners schooled and grounded in frameworks for conducting established psychotherapies. Although they are still a work in progress, these tools can authorize and guide trainees and practitioners to enact specific approaches to psychotherapy utilizing other frameworks. We believe that psychotherapy education and training might benefit from explicitly focusing on the application of such interventions.

  7. Clinical utility of gene expression profiling data for clinical decision-making regarding adjuvant therapy in early stage, node-negative breast cancer: a case report.

    PubMed

    Schuster, Steven R; Pockaj, Barbara A; Bothe, Mary R; David, Paru S; Northfelt, Donald W

    2012-09-10

    Breast cancer is the most common malignancy among women in the United States with the second highest incidence of cancer-related death following lung cancer. The decision-making process regarding adjuvant therapy is a time intensive dialogue between the patient and her oncologist. There are multiple tools that help individualize the treatment options for a patient. Population-based analysis with Adjuvant! Online and genomic profiling with Oncotype DX are two commonly used tools in patients with early stage, node-negative breast cancer. This case report illustrates a situation in which the population-based prognostic and predictive information differed dramatically from that obtained from genomic profiling and affected the patient's decision. In light of this case, we discuss the benefits and limitations of these tools.

  8. Targeted disruption of sp7 and myostatin with CRISPR-Cas9 results in severe bone defects and more muscular cells in common carp

    PubMed Central

    Zhong, Zhaomin; Niu, Pengfei; Wang, Mingyong; Huang, Guodong; Xu, Shuhao; Sun, Yi; Xu, Xiaona; Hou, Yi; Sun, Xiaowen; Yan, Yilin; Wang, Han

    2016-01-01

    The common carp (Cyprinus carpio), one of the most important aquaculture fishes, produces over 3 million metric tonnes annually, approximately 10% of the annual production of all farmed freshwater fish worldwide. However, the tetraploid genome and long generation time of the common carp have made its breeding and genetic studies extremely difficult. Here, TALEN and CRISPR-Cas9, two versatile genome-editing tools, were employed to target the common carp bone-related genes sp7, runx2, bmp2a, spp1 and opg, and the muscle suppressor gene mstn. TALENs were shown to induce mutations in the target coding sites of sp7, runx2, spp1 and mstn. With CRISPR-Cas9, the two common carp sp7 genes, sp7a and sp7b, were mutated individually, in both cases resulting in severe bone defects, while mstnba-mutated fish grew significantly more muscle cells. We also employed CRISPR-Cas9 to generate sp7a;mstnba double-mutant fish with high efficiency in a single step. These results demonstrate that both TALEN and CRISPR-Cas9 are highly efficient tools for modifying the common carp genome, and they open avenues for facilitating common carp genetic studies and breeding. PMID:26976234

  9. e-Health readiness assessment factors and measuring tools: A systematic review.

    PubMed

    Yusif, Salifu; Hafeez-Baig, Abdul; Soar, Jeffrey

    2017-11-01

    The evolving nature, growing adoption, and high failure rate of health information technology (HIT)/IS/T systems require effective readiness assessment to avert failures while increasing system benefits. However, the literature on HIT readiness assessment is vast and fragmented. This review maps the contours of the available literature and concludes with a set of manageable and usable recommendations for policymakers, researchers, individuals and organizations intending to assess readiness for any HIT implementation. The objectives were to identify studies, analyze readiness factors and offer recommendations. Published articles from 1995 to 2016 were searched using Medline/PubMed, Cinahl, Web of Science, PsycINFO and ProQuest. Studies were included if they assessed IS/T/mHealth readiness in the context of HIT; articles not written in English were excluded. Themes that emerged during data synthesis were thematically analyzed and interpreted. The analyzed themes were found across 63 articles. In order of prevalence of use, they included, but were not limited to, "Technological readiness", 30 (46%); "Core/Need/Motivational readiness", 23 (37%); "Acceptance and use readiness", 19 (29%); "Organizational readiness", 20 (21%); "IT skills/Training/Learning readiness" (18%); "Engagement readiness", 16 (24%); and "Societal readiness" (14%). Despite their prevalence of use, "Technological readiness", "Motivational readiness" and "Engagement readiness" all had myriad and unreliable measuring tools. Core readiness had relatively reliable measuring tools, which have repeatedly been used in various readiness assessment studies. CONCLUSION: There is thus a need for reliable measuring tools for even the most commonly used readiness assessment factors/constructs: Core readiness, Engagement and buy-in readiness, Technological readiness and IT skills readiness. This could serve as an immediate step toward conducting effective and reliable e-Health readiness assessments, which could lead to reduced HIT implementation failures. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Testing the woman abuse screening tool to identify intimate partner violence in Indonesia.

    PubMed

    Iskandar, Livia; Braun, Kathryn L; Katz, Alan R

    2015-04-01

    Intimate Partner Violence (IPV) is a global public health problem. IPV prevalence in Indonesia has been estimated to be less than 1%, based on reported cases. It is likely that IPV prevalence is underreported in Indonesia, as it is in many other countries. Screening for IPV has been found to increase IPV identification, but no screening tools are in use in Indonesia. The aim of this study was to test the translated Woman Abuse Screening Tool (WAST) for detecting IPV in Indonesia. The WAST was tested against a diagnostic interview by a trained psychologist on 240 women attending two Primary Health Centers in Jakarta. IPV prevalence and the reliability, sensitivity, and specificity of the WAST were estimated. Prevalence of IPV by diagnostic interview was 36.3%, much higher than published estimates. The most common forms of IPV identified were psychological (85%) and physical abuse (24%). Internal reliability of the WAST was high (α = .801). A WAST score of 13 (out of 24) is the recommended cutoff for identifying IPV, but only 17% of the Indonesian sample scored 13 or higher. Test sensitivity of the WAST with a cutoff score of 13 was only 41.9%, with a specificity of 96.8%. With a cutoff score of 10, the sensitivity improved to 84.9%, while the specificity decreased to 61.0%. Use of the WAST with a cutoff score of 10 provides good sensitivity and reasonable specificity and would provide a much-needed screening tool for use in Indonesia. Although a lower cutoff would yield a greater proportion of false positives, most of the true cases would be identified, increasing the possibility that women experiencing abuse would receive needed assistance. © The Author(s) 2014.
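    The cutoff tradeoff reported above follows directly from how the two rates are computed. A minimal sketch; the counts below are illustrative approximations consistent with the reported rates (36.3% of 240 women, ~87 true cases), not the study's actual confusion-matrix cells, which the abstract does not give:

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of true IPV cases the screen flags as positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of non-cases the screen correctly clears."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts out of ~87 true cases: lowering the cutoff flags
# more true cases (sensitivity rises) but also more non-cases
# (specificity falls).
strict = sensitivity(36, 51)    # cutoff 13: ~0.41
lenient = sensitivity(74, 13)   # cutoff 10: ~0.85
print(round(strict, 3), round(lenient, 3))
```

    Moving the cutoff can only trade one error type for the other; the study's recommendation of 10 reflects a preference for catching true cases over avoiding false positives.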

  11. Developing a research and practice tool to measure walkability: a demonstration project.

    PubMed

    Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley

    2014-12-01

    Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas, or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of consistent national data measuring land use mix, it has not yet been possible to create a national walkability measure. The next stage of the project is to increase usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia. 
It also demonstrates the value of making accurate spatial data available for research purposes. SO WHAT?: There remains a gap between urban policy and practice, in terms of creating walkable neighbourhoods. When fully implemented, AURIN's walkability tool could be used to benchmark Australian cities against which planning and urban design decisions could be assessed to monitor progress towards achieving policy goals. Making cleaned data readily available for research purposes through a common portal could also save time and financial resources.
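    A common way to combine the three sub-components named above (street connectivity, density, land use mix) into a single index is to sum their standardized z-scores across neighbourhoods. The abstract does not give AURIN's exact formula, so the sketch below is a generic illustration of that approach, not the tool's actual implementation:

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardize one sub-component across all neighbourhoods."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def walkability_index(connectivity, density, land_use_mix):
    """Sum of standardized sub-components, one score per neighbourhood."""
    return [sum(parts) for parts in zip(z_scores(connectivity),
                                        z_scores(density),
                                        z_scores(land_use_mix))]

# Three hypothetical neighbourhoods: the third is the densest, best
# connected and most mixed, so it receives the highest index score.
scores = walkability_index([20, 35, 60], [15, 30, 55], [0.2, 0.5, 0.8])
print(scores.index(max(scores)))  # → 2
```

    Because each sub-component is mean-centred, the index is relative: a score describes a neighbourhood's walkability against the other areas in the same run, which is why consistent national input data matters for a national index.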

  12. Thermo-Mechanical Effect on Poly Crystalline Boron Nitride Tool Life During Friction Stir Welding (Dwell Period)

    NASA Astrophysics Data System (ADS)

    Almoussawi, M.; Smith, A. J.

    2018-05-01

    Poly Crystalline Boron Nitride (PCBN) tool wear during the friction stir welding of high-melting-point alloys is an obstacle to commercializing the process. This work simulates the friction stir welding process and tool wear during the plunge/dwell period in 14.8 mm thick EH46 steel plate. A Computational Fluid Dynamics (CFD) model was used for the simulation, and tool wear was estimated from the temperature and shear stress profiles on the tool surface. Two tool rotational speeds were applied, 120 and 200 RPM. Seven plunge/dwell samples were prepared using a PCBN FSW tool, and six thermocouples were embedded around each plunge/dwell case to record temperatures during the welding process. Infinite focus microscopy was used to create macrographs of each case. The CFD results showed that a shear layer, denoted the thermo-mechanically affected zone (TMAZ), formed around the tool shoulder and probe side, and that its size increased with tool rotational speed. Maximum peak temperature was also found to increase with tool rotational speed. PCBN tool wear under the shoulder was found to increase with tool rotational speed as a result of softening of the tool's binder once the peak temperature exceeded 1250 °C. Tool wear was also found to increase at the bottom of the probe side as a result of the high shear stress associated with the decrease in tool rotational speed. The amount of BN particles revealed by SEM in the TMAZ was compared with the CFD model.

  13. ReportingTools: an automated result processing and presentation toolkit for high-throughput genomic analyses.

    PubMed

    Huntley, Melanie A; Larson, Jessica L; Chaivorapol, Christina; Becker, Gabriel; Lawrence, Michael; Hackney, Jason A; Kaminker, Joshua S

    2013-12-15

    It is common for computational analyses to generate large amounts of complex data that are difficult to process and share with collaborators. Standard methods are needed to transform such data into a more useful and intuitive format. We present ReportingTools, a Bioconductor package, that automatically recognizes and transforms the output of many common Bioconductor packages into rich, interactive, HTML-based reports. Reports are not generic, but have been individually designed to reflect content specific to the result type detected. Tabular output included in reports is sortable, filterable and searchable and contains context-relevant hyperlinks to external databases. Additionally, in-line graphics have been developed for specific analysis types and are embedded by default within table rows, providing a useful visual summary of underlying raw data. ReportingTools is highly flexible and reports can be easily customized for specific applications using the well-defined API. The ReportingTools package is implemented in R and available from Bioconductor (version ≥ 2.11) at the URL: http://bioconductor.org/packages/release/bioc/html/ReportingTools.html. Installation instructions and usage documentation can also be found at the above URL.

  14. VarDetect: a nucleotide sequence variation exploratory tool

    PubMed Central

    Ngamphiw, Chumpol; Kulawonganunchai, Supasak; Assawamakin, Anunchai; Jenwitheesuk, Ekachai; Tongsima, Sissades

    2008-01-01

    Background Single nucleotide polymorphisms (SNPs) are the most commonly studied units of genetic variation. The discovery of such variation may help to identify causative gene mutations in monogenic diseases and SNPs associated with predisposing genes in complex diseases. Accurate detection of SNPs requires software that can correctly interpret chromatogram signals to nucleotides. Results We present VarDetect, a stand-alone nucleotide variation exploratory tool that automatically detects nucleotide variation from fluorescence based chromatogram traces. Accurate SNP base-calling is achieved using pre-calculated peak content ratios, and is enhanced by rules which account for common sequence reading artifacts. The proposed software tool is benchmarked against four other well-known SNP discovery software tools (PolyPhred, novoSNP, Genalys and Mutation Surveyor) using fluorescence based chromatograms from 15 human genes. These chromatograms were obtained from sequencing 16 two-pooled DNA samples; a total of 32 individual DNA samples. In this comparison of automatic SNP detection tools, VarDetect achieved the highest detection efficiency. Availability VarDetect is compatible with most major operating systems such as Microsoft Windows, Linux, and Mac OSX. The current version of VarDetect is freely available at . PMID:19091032

  15. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  16. METHOD AND MEANS FOR CLOSING TUBES BY SPINNING

    DOEpatents

    Graves, E.E.; Coonfare, R.H.

    1958-08-26

    An improved spinning tool is described for producing a fold-free closed end on an aluminum jacketing tube such as is commonly used to protect a uranium fuel element. The tool fits the toolholder of a lathe in which the jacket is rotated. The tool has a number of working faces so that the hemispherical end-closure is formed, the folds and wrinkles are smoothed out, and the excess metal is trimmed off in one transverse cutting operation. This tool considerably speeds up the closure process and eliminates the need for a weld seal.

  17. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. An interactive tool for processing sap flux data from thermal dissipation probes

    Treesearch

    Andrew C. Oishi; Chelcy F. Miniat

    2016-01-01

    Sap flux sensors are an important tool for estimating tree-level transpiration in forested and urban ecosystems around the world. Thermal dissipation (TD) or Granier-type sap flux probes are among the most commonly used due to their reliability, simplicity, and low cost.

  4. New additions to the cancer precision medicine toolkit.

    PubMed

    Mardis, Elaine R

    2018-04-13

    New computational and database-driven tools are emerging to aid in the interpretation of cancer genomic data as its use becomes more common in clinical evidence-based cancer medicine. Two such open source tools, published recently in Genome Medicine, provide important advances to address the clinical cancer genomics data interpretation bottleneck.

  5. RISK ASSESSMENT ANALYSES USING EPA'S ON-LINE SITE-SPECIFIC TRANSPORT MODELS AND FIELD DATA

    EPA Science Inventory

    EPA has developed a suite of on-line calculators and transport models to aid in risk assessment for subsurface contamination. The calculators (www.epa.gov/athens/onsite) provide several levels of tools and data. These include tools for generating commonly-used model input param...

  6. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  7. Informal Assessment as a Tool for Supporting Parent Partnerships

    ERIC Educational Resources Information Center

    Darragh, Johnna

    2009-01-01

    Many strategies contribute to forming co-constructed relationships. However, one of the most effective tools that supports co-construction is ongoing assessment, which provides a common base (knowledge of the child) on which families and professionals can build relationships. As a part of this ongoing assessment, informal strategies--including…

  8. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    EPA Science Inventory

    EPA's solvent substitution software tool, PARIS III is provided by the EPA for free, and can be effective and efficiently used to help environmentally-conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  9. Homework Plans: A Tool for Promoting Independence

    ERIC Educational Resources Information Center

    Hampshire, Patricia K.; Butera, Gretchen D.; Hourcade, Jack J.

    2014-01-01

    The authors of this article discuss a well-acknowledged fact in the world of education--for many students, parents, and teachers, the word "homework" elicits feelings of dread. Although homework is common in most educational settings, not all students benefit from this learning tool, especially without careful planning and forethought.…

  10. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  11. Inclusion of Students with Significant Disabilities in SWPBS Evaluation Tools

    ERIC Educational Resources Information Center

    Kurth, Jennifer A.; Zagona, Alison; Hagiwara, Mayumi; Enyart, Matt

    2017-01-01

    Students with significant disabilities (intellectual and developmental disabilities) are predominantly educated in separate settings, and tend to have little access to schoolwide positive behavior supports (SWPBS). In this study, we first identified the most commonly cited SWPBS evaluation tools in the literature between 2010 and 2016. The SET,…

  12. A mechanistic understanding of tree responses to thinning and fertilization from stable isotopes in tree rings

    EPA Science Inventory

    Carbon sequestration has focused renewed interest in understanding how forest management affects forest carbon gain over timescales of decades. Two of the most common forest management tools are thinning and fertilization, and yet details on physiological responses to these tools...

  13. An Intelligent Agent for the K-12 Educational Community

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.; Hutchison, Mark W.; Shelton, Robert O.; Smith, Stephanie L.; Yazbeck, Marwan E.

    1995-01-01

    Nearly every professional would like to have a personal assistant to perform the physical and mental work of searching libraries for needed information. This paper describes a tool which performs this function using commonly available Internet services. The promises and difficulties of developing such a tool are described.

  14. Multimedia Instructional Tools and Student Learning in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra Laier

    2013-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology into the classroom. Multimedia instructional tools (MMIT) have been identified as a way to provide student-centered active-learning instructional material to students. MMITs are common in introductory computer applications courses based on…

  15. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
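    The core idea (hash the file size plus a fixed number of bytes drawn from deterministically seeded pseudo-random offsets, rather than reading the whole file) can be sketched as follows. This is an illustrative re-implementation of the sampling idea only, not the pfff tool's actual algorithm or output format:

```python
import hashlib
import os
import random

def sampled_fingerprint(path: str, n_samples: int = 1024, seed: int = 0) -> str:
    """Fingerprint a file by hashing its size plus bytes read at
    pseudo-randomly sampled offsets: O(n_samples) work, not O(file size)."""
    size = os.path.getsize(path)
    rng = random.Random(seed)  # fixed seed => same offsets for equal-size files
    offsets = sorted(rng.randrange(size) for _ in range(n_samples)) if size else []
    h = hashlib.sha256(str(size).encode())
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            h.update(f.read(1))
    return h.hexdigest()
```

    Two equal-size files differing in only a few bytes can collide under such a scheme; the paper's contribution is showing that for high-variation biological data the collision risk is provably negligible at modest sample counts.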

  16. Development and Evaluation of the American College of Surgeons NSQIP Pediatric Surgical Risk Calculator.

    PubMed

    Kraemer, Kari; Cohen, Mark E; Liu, Yaoming; Barnhart, Douglas C; Rangel, Shawn J; Saito, Jacqueline M; Bilimoria, Karl Y; Ko, Clifford Y; Hall, Bruce L

    2016-11-01

    There is an increased desire among patients and families to be involved in the surgical decision-making process. A surgeon's ability to provide patients and families with patient-specific estimates of postoperative complications is critical for shared decision making and informed consent. Surgeons can also use patient-specific risk estimates to decide whether or not to operate and what options to offer patients. Our objective was to develop and evaluate a publicly available risk estimation tool that would cover many common pediatric surgical procedures across all specialties. American College of Surgeons NSQIP Pediatric standardized data from 67 hospitals were used to develop a risk estimation tool. Surgeons enter 18 preoperative variables (demographics, comorbidities, procedure) that are used in a logistic regression model to predict 9 postoperative outcomes. A surgeon adjustment score is also incorporated to adjust for any additional risk not accounted for in the 18 risk factors. A pediatric surgical risk calculator was developed based on 181,353 cases covering 382 CPT codes across all specialties. It had excellent discrimination for mortality (c-statistic = 0.98), morbidity (c-statistic = 0.81), and 7 additional complications (c-statistic > 0.77). The Hosmer-Lemeshow statistic and graphic representations also showed excellent calibration. The ACS NSQIP Pediatric Surgical Risk Calculator was developed using standardized and audited multi-institutional data from the ACS NSQIP Pediatric, and it provides empirically derived, patient-specific postoperative risks. It can be used as a tool in the shared decision-making process by providing clinicians, families, and patients with useful information for many of the most common operations performed on pediatric patients in the US. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
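    Mechanically, a calculator of this kind applies a logistic regression model per outcome: the encoded preoperative variables are multiplied by fitted coefficients, summed with an intercept and, here, the surgeon adjustment score, then passed through the logistic function. The coefficients below are hypothetical placeholders, not the fitted ACS NSQIP Pediatric values:

```python
import math

def predicted_risk(x, coefficients, intercept, surgeon_adjustment=0.0):
    """Logistic-regression risk estimate for one patient.

    x: encoded preoperative risk factors
    surgeon_adjustment: shifts the linear predictor to capture risk
    not reflected in the recorded factors.
    """
    linear = intercept + sum(b * v for b, v in zip(coefficients, x))
    linear += surgeon_adjustment
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical two-factor model: baseline estimate, then the same
# patient with an upward surgeon adjustment.
base = predicted_risk([1, 0], [0.8, 1.5], intercept=-3.0)
adjusted = predicted_risk([1, 0], [0.8, 1.5], intercept=-3.0,
                          surgeon_adjustment=1.0)
print(base < adjusted)  # → True
```

    Reported discrimination (the c-statistic) measures how often the model ranks a patient who had the outcome above one who did not; calibration, checked here via Hosmer-Lemeshow, measures whether these predicted probabilities match observed event rates.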

  17. Infectious disease transmission and contact networks in wildlife and livestock.

    PubMed

    Craft, Meggan E

    2015-05-26

    The use of social and contact networks to answer basic and applied questions about infectious disease transmission in wildlife and livestock is receiving increased attention. Through social network analysis, we understand that wild animal and livestock populations, including farmed fish and poultry, often have a heterogeneous contact structure owing to social structure or trade networks. Network modelling is a flexible tool used to capture the heterogeneous contacts of a population in order to test hypotheses about the mechanisms of disease transmission, simulate and predict disease spread, and test disease control strategies. This review highlights how to use animal contact data, including social networks, for network modelling, and emphasizes that researchers should have a pathogen of interest in mind before collecting or using contact data. This paper describes the rising popularity of network approaches for understanding transmission dynamics in wild animal and livestock populations; discusses the common mismatch between contact networks as measured in animal behaviour and relevant parasites to match those networks; and highlights knowledge gaps in how to collect and analyse contact data. Opportunities for the future include increased attention to experiments, pathogen genetic markers and novel computational tools. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  18. Infectious disease transmission and contact networks in wildlife and livestock

    PubMed Central

    Craft, Meggan E.

    2015-01-01

    The use of social and contact networks to answer basic and applied questions about infectious disease transmission in wildlife and livestock is receiving increased attention. Through social network analysis, we understand that wild animal and livestock populations, including farmed fish and poultry, often have a heterogeneous contact structure owing to social structure or trade networks. Network modelling is a flexible tool used to capture the heterogeneous contacts of a population in order to test hypotheses about the mechanisms of disease transmission, simulate and predict disease spread, and test disease control strategies. This review highlights how to use animal contact data, including social networks, for network modelling, and emphasizes that researchers should have a pathogen of interest in mind before collecting or using contact data. This paper describes the rising popularity of network approaches for understanding transmission dynamics in wild animal and livestock populations; discusses the common mismatch between contact networks as measured in animal behaviour and relevant parasites to match those networks; and highlights knowledge gaps in how to collect and analyse contact data. Opportunities for the future include increased attention to experiments, pathogen genetic markers and novel computational tools. PMID:25870393
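
    The kind of network model the review describes can be sketched as a discrete-time SIR simulation over an explicit contact structure. The contact network, transmission probability, and recovery probability below are illustrative assumptions:

```python
import random

def simulate_sir(contacts, patient_zero, beta=0.3, gamma=0.1, steps=100, seed=1):
    """Discrete-time SIR epidemic on a contact network (illustrative sketch).

    `contacts` maps each animal to the set of animals it contacts;
    `beta` is the per-contact transmission probability per step and
    `gamma` the per-step recovery probability. Returns the final
    attack size (everyone ever infected, including patient zero).
    """
    rng = random.Random(seed)
    infected = {patient_zero}
    recovered = set()
    for _ in range(steps):
        new_infected = set()
        for i in infected:
            for j in contacts.get(i, ()):
                if j not in infected and j not in recovered and rng.random() < beta:
                    new_infected.add(j)
        newly_recovered = {i for i in infected if rng.random() < gamma}
        infected = (infected | new_infected) - newly_recovered
        recovered |= newly_recovered
        if not infected:
            break
    return len(infected | recovered)

# A heterogeneous network: one highly connected "hub" farm.
net = {"hub": {"a", "b", "c", "d"}, "a": {"hub"}, "b": {"hub"},
      "c": {"hub"}, "d": {"hub"}}
print(simulate_sir(net, "hub"))
```

    Swapping in an empirically measured contact network for `net` is exactly the step the review argues should be matched to a specific pathogen's transmission mode.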

  19. Visual analysis of inter-process communication for large-scale parallel computing.

    PubMed

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.
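
    One way such a view can be made to scale, sketched here under assumed inputs, is to replace the one-row-per-process Gantt chart with a fixed-size aggregate profile whose size is independent of the process count. The event format and binning scheme below are illustrative, not the paper's actual technique:

```python
def activity_histogram(process_events, n_bins, t_max):
    """Aggregate per-process communication intervals into a scalable
    summary: for each time bin, count how many processes were waiting
    on communication. The result has a fixed size (n_bins) no matter
    how many processes there are.
    """
    bins = [0] * n_bins
    width = t_max / n_bins
    for intervals in process_events:          # one list of (start, end) per process
        for start, end in intervals:
            first = int(start // width)
            last = min(int(end // width), n_bins - 1)
            for b in range(first, last + 1):
                bins[b] += 1
    return bins

# Three processes, comm intervals in seconds over a 4 s run, 4 bins:
events = [[(0.0, 1.0)], [(0.5, 2.5)], [(3.0, 4.0)]]
print(activity_histogram(events, 4, 4.0))  # -> [2, 2, 1, 1]
```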

  20. A Review of Evidence Presented in Support of Three Key Claims in the Validity Argument for the "TextEvaluator"® Text Analysis Tool. Research Report. ETS RR-16-12

    ERIC Educational Resources Information Center

    Sheehan, Kathleen M.

    2016-01-01

    The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers and other educators select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards (CCSS). This paper provides an overview of the TextEvaluator measurement approach and…

  1. Recognition and Use of Kitchen Tools and Utensils. Learning Activity Pack and Instructor's Guide 4.4. Commercial Foods and Culinary Arts Competency-Based Series. Section 4: Equipment Handling, Operation, and Maintenance.

    ERIC Educational Resources Information Center

    Florida State Univ., Tallahassee. Center for Studies in Vocational Education.

    This document consists of a learning activity packet (LAP) for the student and an instructor's guide for the teacher. The LAP is intended to acquaint occupational home economics students with common tools and utensils used in commercial kitchens. Illustrated information sheets and learning activities are provided on various kitchen tools (cutting,…

  2. An Overview of Promising Grades of Tool Materials Based on the Analysis of their Physical-Mechanical Characteristics

    NASA Astrophysics Data System (ADS)

    Kudryashov, E. A.; Smirnov, I. M.; Grishin, D. V.; Khizhnyak, N. A.

    2018-06-01

    The work is aimed at selecting a promising grade of tool material whose physical-mechanical characteristics allow it to be used for machining the surfaces of discontinuous parts under shock loads. An analysis of the physical-mechanical characteristics of the most common tool materials is performed, and data on the possible use of promising composite grades in metal-working processes are presented.

  3. The Advanced Course in Professional Selling

    ERIC Educational Resources Information Center

    Loe, Terry; Inks, Scott

    2014-01-01

    More universities are incorporating sales content into their curriculums, and although the introductory courses in professional sales have much common ground and guidance from numerous professional selling texts, instructors teaching the advanced selling course lack the guidance provided by common academic tools and materials. The resulting…

  4. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.

  5. Practical solutions for staff recruitment & retention.

    PubMed

    Vander Hoek, N

    2001-01-01

    There are three essential topics for radiology managers to consider in light of persistent staffing shortages: support of the profession and educational programs, perks as recruitment tools and incentives as retention tools. Some activities that can help support departments and educational programs for radiologic technologists are job shadowing, training for volunteer services, advanced placement for school applicants, sponsoring an educational program or clinical training site, creating a positive work environment and supporting outreach projects geared to local high schools. Traditional perks used in recruitment efforts have included relocation assistance, travel and lodging expenses during the interview process, loan repayment, scholarships and sign-on bonuses. Some common incentives for retaining employees are tuition reimbursement, cross training, availability of educational resources, continuing education opportunities, professional development and incremental increases in salary. There are many other tools that can be used, such as career ladders, creating an environment conducive to teamwork or a more personal atmosphere and showcasing talents of various staff members. There is much overlap among these suggestions in support of the profession and educational programs, recruitment and retention of qualified staff radiologic technologists. Radiology managers can and should be creative in developing different programs to build loyalty and commitment to a radiology department.

  6. The Brøset Violence Checklist: clinical utility in a secure psychiatric intensive care setting.

    PubMed

    Clarke, D E; Brown, A-M; Griffith, P

    2010-09-01

    Violence towards health-care workers, especially in areas such as mental health/psychiatry, has become increasingly common, with nursing staff suggesting that a fear of violence from their patients may affect the quality of care they provide. Structured clinical tools have the potential to assist health-care providers in identifying patients who have the potential to become violent or aggressive. The Brøset Violence Checklist (BVC), a six-item instrument that uses the presence or absence of three patient characteristics and three patient behaviours to predict the potential for violence within a subsequent 24-h period, was trialled for 3 months on an 11-bed secure psychiatric intensive care unit. Despite the belief on the part of some nurses that decisions related to risk for violence and aggression rely heavily on intuition, there was widespread acceptance of the tool. During the trial, use of seclusion decreased suggesting that staff were able to intervene before seclusion was necessary. The tool has since been implemented as a routine part of patient care on two units in a 92-bed psychiatric centre. Five-year follow-up data and implications for practice are presented.

  7. Modern data-driven decision support systems: the role of computing with words and computational linguistics

    NASA Astrophysics Data System (ADS)

    Kacprzyk, Janusz; Zadrożny, Sławomir

    2010-05-01

    We present how the conceptually and numerically simple concept of a fuzzy linguistic database summary can be a very powerful tool for gaining much insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to an increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new relevant aspects of the analysis are also outlined which were first initiated by the authors. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG). This can make it possible to use more and more effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of scalability of linguistic summarisation of data, using a new concept of a conceptual scalability.
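
    The linguistic-summary idea can be sketched in the style of Yager's truth degree for a quantified proposition such as "most values are young": average a fuzzy predicate over the data, then pass the proportion through a fuzzy quantifier. The membership functions and data below are invented for illustration:

```python
def truth_of_summary(values, predicate, quantifier):
    """Yager-style truth degree of a linguistic summary: average the
    fuzzy predicate membership over the data, then apply the fuzzy
    quantifier to that proportion.
    """
    r = sum(predicate(v) for v in values) / len(values)
    return quantifier(r)

def most(r):
    """Trapezoidal membership for the quantifier "most" (illustrative)."""
    if r <= 0.3:
        return 0.0
    if r >= 0.8:
        return 1.0
    return (r - 0.3) / 0.5

# Fuzzy predicate "young" over ages (illustrative membership function).
young = lambda age: 1.0 if age < 25 else max(0.0, (35 - age) / 10)

ages = [22, 30, 41, 24, 28]
print(round(truth_of_summary(ages, young, most), 2))  # -> 0.68
```

    The output is the degree to which "most records are young" holds for this data set, which is the kind of verbalised result the paper proposes feeding to decision makers.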

  8. Laser speckle and skin cancer: skin roughness assessment

    NASA Astrophysics Data System (ADS)

    Lee, Tim K.; Tchvialeva, Lioudmila; Zeng, Haishan; McLean, David I.; Lui, Harvey

    2009-10-01

    Incidence of skin cancer has been increasing rapidly over the last few decades. Non-invasive optical diagnostic tools may improve the diagnostic accuracy. In this paper, skin structure, skin cancer statistics and subtypes of skin cancer are briefly reviewed. Among the subtypes, malignant melanoma is the most aggressive and dangerous; early detection dramatically improves the prognosis. Therefore, a non-invasive diagnostic tool for malignant melanoma is especially needed. In addition, in order for the diagnostic tool to be useful, it must be able to differentiate melanoma from common skin conditions such as seborrheic keratosis, a benign skin disease that resembles melanoma according to the well-known clinical-assessment ABCD rule. The key diagnostic feature between these two diseases is surface roughness. Based on laser speckle contrast, our research team has recently developed a portable, optical, non-invasive, in-vivo diagnostic device for quantifying skin surface roughness. The methodology of our technique is described in detail. Examining the preliminary data collected in a pilot clinical study for the prototype, we found that there was a difference in roughness between melanoma and seborrheic keratosis. In fact, there was a perfect cutoff value for the two diseases based on our initial data.
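
    The core quantity behind this technique is speckle contrast, conventionally defined as the standard deviation of the intensity pattern divided by its mean. The computation below is that standard definition applied to made-up intensity samples, not the device's actual processing pipeline:

```python
import math

def speckle_contrast(intensities):
    """Speckle contrast C = sigma / mean of a measured intensity
    pattern. Surface roughness changes the speckle statistics and
    hence C (illustrative computation on a flat list of samples).
    """
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((i - mean) ** 2 for i in intensities) / n  # population variance
    return math.sqrt(var) / mean

print(speckle_contrast([1.0, 3.0, 1.0, 3.0]))  # -> 0.5
```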

  9. Dietary assessment in minority ethnic groups: a systematic review of instruments for portion-size estimation in the United Kingdom

    PubMed Central

    Almiron-Roig, Eva; Aitken, Amanda; Galloway, Catherine

    2017-01-01

    Context: Dietary assessment in minority ethnic groups is critical for surveillance programs and for implementing effective interventions. A major challenge is the accurate estimation of portion sizes for traditional foods and dishes. Objective: The aim of this systematic review was to assess records published up to 2014 describing a portion-size estimation element (PSEE) applicable to the dietary assessment of UK-residing ethnic minorities. Data sources, selection, and extraction: Electronic databases, internet sites, and theses repositories were searched, generating 5683 titles, from which 57 eligible full-text records were reviewed. Data analysis: Forty-two publications about minority ethnic groups (n = 20) or autochthonous populations (n = 22) were included. The most common PSEEs (47%) were combination tools (eg, food models and portion-size lists), followed by portion-size lists in questionnaires/guides (19%) and image-based and volumetric tools (17% each). Only 17% of PSEEs had been validated against weighed data. Conclusions: When developing ethnic-specific dietary assessment tools, it is important to consider customary portion sizes by sex and age, traditional household utensil usage, and population literacy levels. Combining multiple PSEEs may increase accuracy, but such methods require validation. PMID:28340101

  10. Arabidopsis research requires a critical re-evaluation of genetic tools.

    PubMed

    Nikonorova, Natalia; Yue, Kun; Beeckman, Tom; De Smet, Ive

    2018-06-27

    An increasing number of reports question conclusions based on loss-of-function lines that have unexpected genetic backgrounds. In this opinion paper, we urge researchers to meticulously (re)investigate phenotypes retrieved from various genetic backgrounds and be critical regarding some previously drawn conclusions. As an example, we provide new evidence that acr4-2 mutant phenotypes with respect to columella stem cells are due to the lack of ACR4 and not - at least not as a major contributor - to a mutation in QRT1. In addition, we take the opportunity to alert the scientific community about the qrt1-2 background of a large number of Syngenta Arabidopsis Insertion Library (SAIL) T-DNA lines, a feature that is not commonly recognized by Arabidopsis researchers. This qrt1-2 background might have an important impact on the interpretation of the results obtained using these research tools, now and in the past. In conclusion, as a community, we should continuously assess and - if necessary - correct our conclusions based on the large number of (genetic) tools our work is built on. In addition, the positive or negative results of this self-criticism should be made available to the scientific community.

  11. THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...

    EPA Pesticide Factsheets

    CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical-user-interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions and biological inference.
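
    One of the methods named above, the species-sensitivity distribution, can be sketched as fitting a log-normal distribution to per-species toxicity endpoints and reading off its 5th percentile (the HC5). The data and the log-normal assumption below are illustrative; CADStat offers far more complete SSD fitting:

```python
import math
import statistics

def hc5(toxicity_values):
    """Hazardous concentration for 5% of species (HC5) from a
    species-sensitivity distribution, assuming log-normally
    distributed toxicity endpoints (illustrative sketch).
    """
    logs = [math.log(v) for v in toxicity_values]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)        # sample standard deviation
    z05 = -1.6449                         # 5th percentile of the standard normal
    return math.exp(mu + z05 * sigma)

# Hypothetical per-species effect concentrations (ug/L) for one stressor:
ec50s = [12.0, 30.0, 45.0, 80.0, 150.0, 400.0]
print(round(hc5(ec50s), 1))
```

    The HC5 is the concentration expected to exceed the sensitivity of only 5% of species, which is why it is a common benchmark in this kind of causal and risk assessment.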

  12. DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology

    PubMed Central

    Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng

    2015-01-01

    Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437

  13. A Review of Shared Decision-Making and Patient Decision Aids in Radiation Oncology.

    PubMed

    Woodhouse, Kristina Demas; Tremont, Katie; Vachani, Anil; Schapira, Marilyn M; Vapiwala, Neha; Simone, Charles B; Berman, Abigail T

    2017-06-01

    Cancer treatment decisions are complex and may be challenging for patients, as multiple treatment options can often be reasonably considered. As a result, decisional support tools have been developed to assist patients in the decision-making process. A commonly used intervention to facilitate shared decision-making is a decision aid, which provides evidence-based outcomes information and guides patients towards choosing the treatment option that best aligns with their preferences and values. To ensure high quality, systematic frameworks and standards have been proposed for the development of an optimal aid for decision making. Studies have examined the impact of these tools on facilitating treatment decisions and improving decision-related outcomes. In radiation oncology, randomized controlled trials have demonstrated that decision aids have the potential to improve patient outcomes, including increased knowledge about treatment options and decreased decisional conflict with decision-making. This article provides an overview of the shared-decision making process and summarizes the development, validation, and implementation of decision aids as patient educational tools in radiation oncology. Finally, this article reviews the findings from decision aid studies in radiation oncology and offers various strategies to effectively implement shared decision-making into clinical practice.

  14. Community for Data Integration 2013 Annual Report

    USGS Publications Warehouse

    Chang, Michelle Y.; Carlino, Jennifer; Barnes, Christopher; Blodgett, David L.; Bock, Andrew R.; Everette, Anthony L.; Fernette, Gregory L.; Flint, Lorraine E.; Gordon, Janice M.; Govoni, David L.; Hay, Lauren E.; Henkel, Heather S.; Hines, Megan K.; Holl, Sally L.; Homer, Collin G.; Hutchison, Vivian B.; Ignizio, Drew A.; Kern, Tim J.; Lightsom, Frances L.; Markstrom, Steven L.; O'Donnell, Michael S.; Schei, Jacquelyn L.; Schmid, Lorna A.; Schoephoester, Kathryn M.; Schweitzer, Peter N.; Skagen, Susan K.; Sullivan, Daniel J.; Talbert, Colin; Warren, Meredith Pavlick

    2015-01-01

    grow overall USGS capabilities with data and information by increasing visibility of the work of many people throughout the USGS and the CDI community. To achieve these goals, the CDI operates within four applied areas: monthly forums, annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the Community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges, including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI’s first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. Today the CDI continues to hold an annual RFP that funds projects creating data management tools and practices, collaboration tools, and training in support of data integration and delivery.

  15. Modeling post-fire hydro-geomorphic recovery in the Waldo Canyon Fire

    NASA Astrophysics Data System (ADS)

    Kinoshita, Alicia; Nourbakhshbeidokhti, Samira; Chin, Anne

    2016-04-01

    Wildfire can have significant impacts on watershed hydrology and geomorphology by changing soil properties and removing vegetation, often increasing runoff, soil erosion and deposition, debris flows, and flooding. Watershed systems may take several years or longer to recover. During this time, post-fire channel changes have the potential to alter hydraulics that influence characteristics such as time of concentration, time to peak flow, flow capacity, and velocity. Using the case of the 2012 Waldo Canyon Fire in Colorado (USA), this research will leverage field-based surveys and terrestrial Light Detection and Ranging (LiDAR) data to parameterize KINEROS2 (KINematic runoff and EROSion), an event-oriented, physically-based watershed runoff and erosion model. We will use the Automated Geospatial Watershed Assessment (AGWA) tool, a GIS-based hydrologic modeling tool that uses commonly available GIS data layers to parameterize, execute, and spatially visualize runoff and sediment yield for watersheds impacted by the Waldo Canyon Fire. Specifically, two models are developed: an unburned (Bear Creek) watershed and a burned (Williams) watershed. The models will simulate burn severity and treatment conditions. Field data will be used to validate the burned watersheds for pre- and post-fire changes in infiltration, runoff, peak flow, sediment yield, and sediment discharge. Spatial modeling will provide insight into post-fire patterns for varying treatment, burn severity, and climate scenarios. Results will also provide post-fire managers with improved hydro-geomorphic modeling and prediction tools for water resources management and mitigation efforts.

  16. PaR Tensile Truss for Nuclear Decontamination and Decommissioning - 12467

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebler, Gary R.

    2012-07-01

    Remote robotics and manipulators are commonly used in nuclear decontamination and decommissioning (D and D) processes. D and D robots are often deployed using rigid telescoping masts in order to apply and counteract side loads. However, for very long vertical reaches (15 meters or longer) and high lift capacities, a telescoping mast is usually not practical due to the large cross section and weight required to make the mast stiff and resist seismic forces. For those long vertical travel applications, PaR Systems has recently developed the Tensile Truss, a rigid, hoist-driven 'structure' that employs six independent wire rope hoists to achieve long vertical reaches. Like a mast, the Tensile Truss is typically attached to a bridge-mounted trolley and is used as a platform for robotic manipulators and other remotely operated tools. For suspended, rigid deployment of D and D tools with very long vertical reaches, the Tensile Truss can be a better alternative than a telescoping mast. Masts have length limitations that can make them impractical or unworkable as lengths increase. The Tensile Truss also has the added benefits of increased safety, ease of decontamination, superior stiffness and ability to withstand excessive side loading. A Tensile Truss system is currently being considered for D and D operations and spent fuel recovery at the Fukushima Daiichi Nuclear Power Plant in Japan. This system will deploy interchangeable tools such as underwater hydraulic manipulators, hydraulic shears and crushers, grippers and fuel grapples. (authors)

  17. Cardiopulmonary Exercise Testing in Adult Congenital Heart Disease.

    PubMed

    Mantegazza, Valentina; Apostolo, Anna; Hager, Alfred

    2017-07-01

    Recently, the number of patients with congenital heart diseases reaching adulthood has been progressively increasing in developed countries, and new issues are emerging: the evaluation of their capacity to cope with physical activity and whether this knowledge can be used to optimize medical management. A symptom-limited cardiopulmonary exercise test has proven to be an essential tool, because it can objectively evaluate the functional cardiovascular capacity of these patients, identify the pathological mechanisms of the defect (circulatory failure, shunts, and/or pulmonary hypertension), and help prescribe an individualized rehabilitation program when needed. The common findings on cardiopulmonary exercise testing in patients with congenital heart diseases are a reduced peak VO2, an early anaerobic threshold, a blunted heart rate response, a reduced increase of VT, and an increased VE/VCO2. All these measures suggest common pathophysiological abnormalities: (1) a compromised exercise capacity from anomalies affecting the heart, vessels, lungs, or muscles; (2) chronotropic incompetence secondary to cardiac autonomic dysfunction or β-blockers and antiarrhythmic therapy; and (3) ventilatory inefficiency caused by left-heart failure with pulmonary congestion, pulmonary hypertension, pulmonary obstructive vascular disease, or cachexia. Most of these variables also have prognostic significance. For these patients, cardiopulmonary exercise testing allows evaluation and decisions affecting lifestyle and therapeutic interventions.
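
    The ventilatory-efficiency measure mentioned in this abstract, the relation between minute ventilation (VE) and CO2 output (VCO2), is conventionally summarized as a least-squares slope over the exercise data. A minimal sketch with made-up measurements:

```python
def ve_vco2_slope(ve, vco2):
    """Least-squares slope of VE (L/min) against VCO2 (L/min), a
    common CPET ventilatory-efficiency index. Higher slopes indicate
    ventilatory inefficiency (illustrative computation).
    """
    n = len(ve)
    mx = sum(vco2) / n
    my = sum(ve) / n
    num = sum((x - mx) * (y - my) for x, y in zip(vco2, ve))
    den = sum((x - mx) ** 2 for x in vco2)
    return num / den

# Hypothetical measurements taken at four points during exercise:
print(ve_vco2_slope([20, 35, 50, 65], [0.5, 1.0, 1.5, 2.0]))  # -> 30.0
```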

  18. Detection of nerve gases using surface-enhanced Raman scattering substrates with high droplet adhesion

    NASA Astrophysics Data System (ADS)

    Hakonen, Aron; Rindzevicius, Tomas; Schmidt, Michael Stenbæk; Andersson, Per Ola; Juhlin, Lars; Svedendahl, Mikael; Boisen, Anja; Käll, Mikael

    2016-01-01

    Threats from chemical warfare agents, commonly known as nerve gases, constitute a serious security issue of increasing global concern because of surging terrorist activity worldwide. However, nerve gases are difficult to detect using current analytical tools and outside dedicated laboratories. Here we demonstrate that surface-enhanced Raman scattering (SERS) can be used for sensitive detection of femtomol quantities of two nerve gases, VX and Tabun, using a handheld Raman device and SERS substrates consisting of flexible gold-covered Si nanopillars. The substrate surface exhibits high droplet adhesion and nanopillar clustering due to elasto-capillary forces, resulting in enrichment of target molecules in plasmonic hot-spots with high Raman enhancement. The results may pave the way for strategic life-saving SERS detection of chemical warfare agents in the field. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06524k

  19. Using bio.tools to generate and annotate workbench tool descriptions

    PubMed Central

    Hillion, Kenzo-Hugo; Kuzmin, Ivan; Khodak, Anton; Rasche, Eric; Crusoe, Michael; Peterson, Hedi; Ison, Jon; Ménager, Hervé

    2017-01-01

    Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks, facilitate the access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time consuming and error-prone process. A major consequence is the incomplete or outdated description of tools that are often missing important information, including parameters and metadata such as publication or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins, and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata. PMID:29333231
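
    The first step ToolDog performs, generating a tool-description skeleton from registry metadata, can be sketched as follows. The metadata fields and the minimal XML layout are illustrative assumptions, not ToolDog's actual output or Galaxy's exact tool schema:

```python
import xml.etree.ElementTree as ET

def tool_skeleton(metadata):
    """Build a minimal Galaxy-style tool-description skeleton from a
    bio.tools-like metadata record (a sketch of the idea only).
    """
    tool = ET.Element("tool", id=metadata["id"], name=metadata["name"],
                      version=metadata.get("version", "0.1"))
    ET.SubElement(tool, "description").text = metadata.get("description", "")
    ET.SubElement(tool, "help").text = metadata.get("homepage", "")
    return ET.tostring(tool, encoding="unicode")

# Hypothetical registry entry:
meta = {"id": "toolA", "name": "Tool A", "description": "Example entry",
        "homepage": "https://bio.tools/toolA"}
xml_out = tool_skeleton(meta)
print(xml_out)
```

    In the same spirit, the enrichment module described above would then fill further elements of the skeleton from the registry's metadata rather than from the source code.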

  20. Enforced Sparse Non-Negative Matrix Factorization

    DTIC Science & Technology

    2016-01-23

    documents to find interesting pieces of information. With limited resources, analysts often employ automated text-mining tools that highlight common...represented as an undirected bipartite graph. It has become a common method for generating topic models of text data because it is known to produce good results...model and the convergence rate of the underlying algorithm. I. Introduction A common analyst challenge is searching through large quantities of text
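
    The factorization underlying such topic models can be sketched with plain (unregularized) NMF using Lee-Seung multiplicative updates; the sparsity enforcement this report develops is omitted, and the tiny term-document matrix below is invented:

```python
import random

def nmf(V, k, iters=200, seed=0):
    """Non-negative matrix factorization V ~ W H via multiplicative
    updates (Lee-Seung). Rows of H act as topics over terms; rows of
    W give per-document topic weights. Pure-Python sketch for tiny
    dense matrices, without the sparsity enforcement discussed above.
    """
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    eps = 1e-9

    def matmul(A, B):
        return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
                 for j in range(len(B[0]))] for i in range(len(A))]

    def transpose(A):
        return [list(col) for col in zip(*A)]

    for _ in range(iters):
        WH = matmul(W, H)
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(Wt, WH)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        WH = matmul(W, H)
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(WH, Ht)
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H

# Two documents about two distinct topics over four terms:
V = [[3, 2, 0, 0], [0, 0, 2, 3]]
W, H = nmf(V, k=2)
approx = [[sum(W[i][t] * H[t][j] for t in range(2)) for j in range(4)]
          for i in range(2)]
```

    Because the updates are multiplicative, W and H stay non-negative throughout, which is what lets the factors be read as (unnormalized) topic distributions.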
