Sample records for general purpose tools

  1. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  2. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.

  3. 41 CFR 60-741.40 - General purpose and applicability of the affirmative action program requirement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 41 Public Contracts and Property Management 1 2014-07-01 2014-07-01 false General purpose and... Property Management Other Provisions Relating to Public Contracts OFFICE OF FEDERAL CONTRACT COMPLIANCE... requirement. (a) General purpose. An affirmative action program is a management tool designed to ensure equal...

  4. 47 CFR 32.2114 - Tools and other work equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Balance Sheet Accounts § 32... equipment, general purpose tools, and other items of work equipment. [64 FR 50007, Sept. 15, 1999] ...

  5. Investigating the Digital Addiction Level of the University Students According to Their Purposes for Using Digital Tools

    ERIC Educational Resources Information Center

    Kesici, Ahmet; Tunç, Nazenin Fidan

    2018-01-01

    This study was carried out to investigate the digital addiction (DA) level of the university students according to their purposes for using digital tools. 527 students studying at the faculties of education of Erzincan, Dicle, and Siirt Universities participated in this study, in which a general survey model was used. A form was used to reveal for which…

  6. 29 CFR 1915.131 - General precautions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... electric cords for this purpose is prohibited. (b) When air tools of the reciprocating type are not in use, the dies and tools shall be removed. (c) All portable, power-driven circular saws shall be equipped... whip. (f) The moving parts of drive mechanisms, such as gearing and belting on large portable tools...

  7. 29 CFR 1915.131 - General precautions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... electric cords for this purpose is prohibited. (b) When air tools of the reciprocating type are not in use, the dies and tools shall be removed. (c) All portable, power-driven circular saws shall be equipped... whip. (f) The moving parts of drive mechanisms, such as gearing and belting on large portable tools...

  8. 29 CFR 1915.131 - General precautions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... electric cords for this purpose is prohibited. (b) When air tools of the reciprocating type are not in use, the dies and tools shall be removed. (c) All portable, power-driven circular saws shall be equipped... whip. (f) The moving parts of drive mechanisms, such as gearing and belting on large portable tools...

  9. Generalized Fluid System Simulation Program (GFSSP) Version 6 - General Purpose Thermo-Fluid Network Analysis Software

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Leclair, Andre; Moore, Ric; Schallhorn, Paul

    2011-01-01

    GFSSP stands for Generalized Fluid System Simulation Program. It is a general-purpose computer program to compute pressure, temperature, and flow distribution in a flow network. GFSSP calculates pressure, temperature, and concentrations at nodes and calculates flow rates through branches. It was primarily developed to analyze internal flow in a turbopump and transient flow in a propulsion system. GFSSP development started in 1994 with the objective of providing a generalized and easy-to-use flow analysis tool for thermo-fluid systems.
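    The core idea the abstract describes, pressures at nodes and flows through branches of a network, can be sketched in a few lines. This is a generic illustration only, not GFSSP's actual method: the linear pressure-flow relation and the conductance values are assumptions made for the demo, whereas real thermo-fluid solvers use nonlinear branch relations.

```python
# Toy flow-network solver: interior node pressures are relaxed until mass
# is conserved at every node, assuming flow = conductance * (p_a - p_b).
def solve_network(neighbors, fixed, iters=2000):
    """neighbors: node -> {other_node: conductance}; fixed: node -> pressure."""
    p = {n: fixed.get(n, 0.0) for n in neighbors}
    for _ in range(iters):
        for n, conns in neighbors.items():
            if n in fixed:
                continue
            # Mass balance sum_k g_nk * (p_k - p_n) = 0 gives a weighted mean.
            p[n] = sum(g * p[k] for k, g in conns.items()) / sum(conns.values())
    return p

if __name__ == "__main__":
    # Three nodes in series: inlet (10) - middle - outlet (0), equal conductance.
    net = {"in": {"mid": 1.0}, "mid": {"in": 1.0, "out": 1.0}, "out": {"mid": 1.0}}
    p = solve_network(net, {"in": 10.0, "out": 0.0})
    print(p["mid"])  # -> 5.0 (midpoint of the two boundary pressures)
```

    With equal conductances in series, the interior pressure converges to the average of its neighbors, which is the expected midpoint value here.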

  10. Rover Wheel-Actuated Tool Interface

    NASA Technical Reports Server (NTRS)

    Matthews, Janet; Ahmad, Norman; Wilcox, Brian

    2007-01-01

    A report describes an interface for utilizing some of the mobility features of a mobile robot for general-purpose manipulation of tools and other objects. The robot in question, now undergoing conceptual development for use on the Moon, is the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) rover, which is designed to roll over gentle terrain or walk over rough or steep terrain. Each leg of the robot is a six-degree-of-freedom general purpose manipulator tipped by a wheel with a motor drive. The tool interface includes a square cross-section peg, equivalent to a conventional socket-wrench drive, that rotates with the wheel. The tool interface also includes a clamp that holds a tool on the peg, and a pair of fold-out cameras that provides close-up stereoscopic images of the tool and its vicinity. The field of view of the imagers is actuated by the clamp mechanism and is specific to each tool. The motor drive can power any of a variety of tools, including rotating tools for helical fasteners, drills, and such clamping tools as pliers. With the addition of a flexible coupling, it could also power another tool or remote manipulator at a short distance. The socket drive can provide very high torque and power because it is driven by the wheel motor.

  11. Patient-Centered Tools for Medication Information Search

    PubMed Central

    Wilcox, Lauren; Feiner, Steven; Elhadad, Noémie; Vawdrey, David; Tran, Tran H.

    2016-01-01

    Recent research focused on online health information seeking highlights a heavy reliance on general-purpose search engines. However, current general-purpose search interfaces do not necessarily provide adequate support for non-experts in identifying suitable sources of health information. Popular search engines have recently introduced search tools in their user interfaces for a range of topics. In this work, we explore how such tools can support non-expert, patient-centered health information search. Scoping the current work to medication-related search, we report on findings from a formative study focused on the design of patient-centered, medication-information search tools. Our study included qualitative interviews with patients, family members, and domain experts, as well as observations of their use of Remedy, a technology probe embodying a set of search tools. Post-operative cardiothoracic surgery patients and their visiting family members used the tools to find information about their hospital medications and were interviewed before and after their use. Domain experts conducted similar search tasks and provided qualitative feedback on their preferences and recommendations for designing these tools. Findings from our study suggest the importance of four valuation principles underlying our tools: credibility, readability, consumer perspective, and topical relevance. PMID:28163972

  12. Patient-Centered Tools for Medication Information Search.

    PubMed

    Wilcox, Lauren; Feiner, Steven; Elhadad, Noémie; Vawdrey, David; Tran, Tran H

    2014-05-20

    Recent research focused on online health information seeking highlights a heavy reliance on general-purpose search engines. However, current general-purpose search interfaces do not necessarily provide adequate support for non-experts in identifying suitable sources of health information. Popular search engines have recently introduced search tools in their user interfaces for a range of topics. In this work, we explore how such tools can support non-expert, patient-centered health information search. Scoping the current work to medication-related search, we report on findings from a formative study focused on the design of patient-centered, medication-information search tools. Our study included qualitative interviews with patients, family members, and domain experts, as well as observations of their use of Remedy, a technology probe embodying a set of search tools. Post-operative cardiothoracic surgery patients and their visiting family members used the tools to find information about their hospital medications and were interviewed before and after their use. Domain experts conducted similar search tasks and provided qualitative feedback on their preferences and recommendations for designing these tools. Findings from our study suggest the importance of four valuation principles underlying our tools: credibility, readability, consumer perspective, and topical relevance.

  13. Exposure Assessment Tools by Lifestages and Populations - General Population

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  14. Implementation of the Automated Numerical Model Performance Metrics System

    DTIC Science & Technology

    2011-09-26

    question. As of this writing, the DSRC IBM AIX machines DaVinci and Pascal, and the Cray XT Einstein all use the PBS batch queuing system for...3.3). 12 Appendix A – General Automation System This system provides general purpose tools and a general way to automatically run

  15. Simrank: Rapid and sensitive general-purpose k-mer search tool

    PubMed Central

    2011-01-01

    Background Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project http://nihroadmap.nih.gov/hmp. Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Results Here we present a stand-alone utility, Simrank, which allows users to rapidly identify the database strings most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10X to 928X faster depending on the dataset. Conclusions Simrank provides molecular ecologists with a high-throughput, open-source choice for comparing large sequence sets to find similarity. PMID:21524302
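    The k-mer matching strategy the abstract refers to can be sketched briefly. This is a generic illustration of k-mer set similarity (Jaccard index over shared k-mers), not Simrank's actual algorithm or data structures; the sequence names and values are made up for the demo.

```python
def kmers(seq, k=7):
    """Return the set of overlapping k-mers of a string."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def rank_by_kmer_similarity(query, database, k=7):
    """Rank database strings by Jaccard similarity of k-mer sets with the query."""
    q = kmers(query, k)
    scored = []
    for name, seq in database.items():
        s = kmers(seq, k)
        score = len(q & s) / len(q | s) if q | s else 0.0  # shared / total k-mers
        scored.append((score, name))
    return sorted(scored, reverse=True)

if __name__ == "__main__":
    db = {
        "seqA": "ACGTACGTGGTTACGT",   # nearly identical to the query
        "seqB": "TTTTCCCCGGGGAAAA",   # shares no 4-mers with the query
    }
    print(rank_by_kmer_similarity("ACGTACGTGGTTACGA", db, k=4))
```

    Real tools like Simrank achieve their speed with compact bit-encoded k-mer indexes rather than Python sets, but the similarity measure works on the same principle.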

  16. 47 CFR 32.1220 - Inventories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... stock including plant supplies, motor vehicles supplies, tools, fuel, other supplies and material and... reporting purposes, as appropriate, in accordance with generally accepted accounting principles. The...

  17. 47 CFR 32.1220 - Inventories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... stock including plant supplies, motor vehicles supplies, tools, fuel, other supplies and material and... reporting purposes, as appropriate, in accordance with generally accepted accounting principles. The...

  18. 47 CFR 32.1220 - Inventories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... stock including plant supplies, motor vehicles supplies, tools, fuel, other supplies and material and... reporting purposes, as appropriate, in accordance with generally accepted accounting principles. The...

  19. 47 CFR 32.1220 - Inventories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... stock including plant supplies, motor vehicles supplies, tools, fuel, other supplies and material and... reporting purposes, as appropriate, in accordance with generally accepted accounting principles. The...

  20. Integrating and Managing Bim in GIS, Software Review

    NASA Astrophysics Data System (ADS)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS software packages that provide tools to integrate building information in a geo context has risen sharply. More and more GIS software adds tools for this purpose, and other software projects are regularly extending these tools. However, each software package has its own strengths and weaknesses and its intended purpose of use. This paper provides a thorough review to investigate the software capabilities and clarify their purpose. For this study, Autodesk Revit 2012, i.e. BIM editor software, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and then the software was used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, what formats are supported (import/export), and the way building information is imported.

  1. Tomorrows' Air Transportation System Breakout Series Report

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The purpose of this presentation is to discuss tomorrow's air transportation system. Section of this presentation includes: chair comments; other general comments; surface congestion alleviation; runway productivity; enhanced arrival/departure tools; integrated airspace decision support tools; national traffic flow management, runway independent operations; ATM TFM weather; and terminal weather.

  2. A Comparison of Parameter Study Creation and Job Submission Tools

    NASA Technical Reports Server (NTRS)

    DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We consider the differences between the available general purpose parameter study and job submission tools. These tools necessarily share many features, but frequently with differences in the way they are designed and implemented. For this class of features, we will only briefly outline the essential differences. However, we will focus on the unique features which distinguish the ILab parameter study and job submission tool from other packages, and which make the ILab tool easier and more suitable for use in our research and engineering environment.

  3. 77 FR 77038 - Procurement List; Proposed Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... purpose is to provide interested persons an opportunity to submit comments on the proposed actions...: Keystone Vocational Services, Inc., Sharon, PA Contracting Activity: General Services Administration, Tools...

  4. Using Mathematica to Teach Process Units: A Distillation Case Study

    ERIC Educational Resources Information Center

    Rasteiro, Maria G.; Bernardo, Fernando P.; Saraiva, Pedro M.

    2005-01-01

    The question addressed here is how to integrate computational tools, namely interactive general-purpose platforms, in the teaching of process units. Mathematica has been selected as a complementary tool to teach distillation processes, with the main objective of leading students to achieve a better understanding of the physical phenomena involved…

  5. Distance Students' Readiness for Social Media and Collaboration

    ERIC Educational Resources Information Center

    Poellhuber, Bruno; Anderson, Terry

    2011-01-01

    In recent years, there has been a rapid growth in the use of social networking tools (e.g., Facebook) and social media in general, mainly for social purposes (Smith, Salaway & Caruso 2009). Many educators, including ourselves, believe that these tools offer new educational affordances and avenues for students to interact with each other and…

  6. MACHINE TOOL OPERATOR--GENERAL, ENTRY, SUGGESTED GUIDE FOR A TRAINING COURSE.

    ERIC Educational Resources Information Center

    RONEY, MAURICE W.; AND OTHERS

    The purpose of this curriculum guide is to assist the administrator and instructor in planning and developing manpower development and training programs to prepare machine tool operators for entry-level positions. The course outline provides units in: (1) orientation, (2) bench work, (3) shop mathematics, (4) blueprint reading and sketching, (5)…

  7. Care 3, phase 1, volume 2

    NASA Technical Reports Server (NTRS)

    Stiffler, J. J.; Bryant, L. A.; Guccione, L.

    1979-01-01

    A computer program was developed as a general purpose reliability tool for fault tolerant avionics systems. The computer program requirements, together with several appendices containing computer printouts are presented.

  8. 24 CFR 902.1 - Purpose, scope, and general matters.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... agencies (PHAs), public housing residents, and the general public, by providing a management tool for... requirements for poor performers. (b) Scope. PHAS is a strategic measure of the essential housing operations of... indicators, which are more fully addressed in § 902.9: Physical condition, financial condition, management...

  9. Perspective Tools of the Strategic Management of VFR Tourism Development at the Regional Level

    ERIC Educational Resources Information Center

    Gorbunov, Aleksandr P.; Efimova, Ekaterina V.; Kobets, Margarita V.; Kilinkarova, Sofiya G.

    2016-01-01

    This study is aimed at identifying the perspective tools of strategic management in general and strategic planning of VFR tourism (for the purpose of visiting friends and relatives) at the regional level in particular. It is based on dialectical and logical methods, analysis and synthesis, induction and deduction, the concrete historical and…

  10. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.

  11. Workplace Learning among General Practitioners and Specialists: The Use of Videoconferencing as a Tool

    ERIC Educational Resources Information Center

    Nilsen, Line Lundvoll

    2011-01-01

    Purpose: Videoconferencing between general practitioners and hospitals has been developed to provide higher quality health care services in Norway by promoting interaction between levels of care. This article aims to explore the use of videoconferencing for information exchange and consultation throughout the patient trajectory and to investigate…

  12. Requirements Specification Language (RSL) and supporting tools

    NASA Technical Reports Server (NTRS)

    Frincke, Deborah; Wolber, Dave; Fisher, Gene; Cohen, Gerald C.

    1992-01-01

    This document describes a general purpose Requirement Specification Language (RSL). RSL is a hybrid of features found in several popular requirement specification languages. The purpose of RSL is to describe precisely the external structure of a system comprised of hardware, software, and human processing elements. To overcome the deficiencies of informal specification languages, RSL includes facilities for mathematical specification. Two RSL interface tools are described. The Browser view contains a complete document with all details of the objects and operations. The Dataflow view is a specialized, operation-centered depiction of a specification that shows how specified operations relate in terms of inputs and outputs.

  13. HyperCard as a Text Analysis Tool for the Qualitative Researcher.

    ERIC Educational Resources Information Center

    Handler, Marianne G.; Turner, Sandra V.

    HyperCard is a general-purpose program for the Macintosh computer that allows multiple ways of viewing and accessing a large body of information. Two ways in which HyperCard can be used as a research tool are illustrated. One way is to organize and analyze qualitative data from observations, interviews, surveys, and other documents. The other way…

  14. A De Novo Tool to Measure the Preclinical Learning Climate of Medical Faculties in Turkey

    ERIC Educational Resources Information Center

    Yilmaz, Nilufer Demiral; Velipasaoglu, Serpil; Sahin, Hatice; Basusta, Bilge Uzun; Midik, Ozlem; Coskun, Ozlem; Budakoglu, Isil Irem; Mamakli, Sumer; Tengiz, Funda Ifakat; Durak, Halil Ibrahim; Ozan, Sema

    2015-01-01

    Although several scales are used to measure general and clinical learning climates, there are no scales that assess the preclinical learning climate. Therefore, the purpose of this study was to develop an effective measurement tool in order to assess the preclinical learning climate. In this cross-sectional study, data were collected from 3,540…

  15. General-Purpose Electronic System Tests Aircraft

    NASA Technical Reports Server (NTRS)

    Glover, Richard D.

    1989-01-01

    Versatile digital equipment supports research, development, and maintenance. The extended aircraft interrogation and display system is a general-purpose assembly of digital electronic equipment, used on the ground for testing digital electronic systems on advanced aircraft. It has many advanced features, including multiple 16-bit microprocessors, a pipeline data-flow architecture, an advanced operating system, and resident software-development tools. The basic collection of software includes programs for handling many types of data and for displays in various formats. The user can easily extend the basic software library. Hardware and software interfaces to user-provided subsystems are designed for flexibility in configuration to meet the user's requirements.

  16. A special purpose silicon compiler for designing supercomputing VLSI systems

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems itself becomes complex. Hence a novel design methodology has to be developed. For designing such complex systems a special purpose silicon compiler is needed in which: the computational and communicational structures of different numeric algorithms should be taken into account to simplify the silicon compiler design, the approach is macrocell based, and the software tools at different levels (algorithm down to the VLSI circuit layout) should get integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate get reduced over the silicon compilers based on PLA's, SLA's, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLA's, SLA's, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.

  17. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  18. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  19. Advanced instrumentation for QELS experiments

    NASA Technical Reports Server (NTRS)

    Tscharnuter, Walther; Weiner, Bruce; Thomas, John

    1989-01-01

    Quasi Elastic Light Scattering (QELS) experiments have become an important tool in both research and quality control applications during the past 25 years. From the crude beginnings employing mechanically driven spectrum analyzers, an impressive array of general purpose digital correlators and special purpose particle sizers is now commercially available. The principles of QELS experiments are reviewed, their advantages and disadvantages are discussed and new instrumentation is described.

  20. Problem solving with genetic algorithms and Splicer

    NASA Technical Reports Server (NTRS)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
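    The basic genetic-algorithm loop the abstract introduces (selection, crossover, mutation over a population) can be sketched as follows. This is a generic one-max demonstration, not Splicer's interface; the fitness function, operators, and all parameter values are assumptions chosen for illustration.

```python
import random

def one_max(bits):
    """Toy fitness: number of 1-bits in the individual."""
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: the fitter of two random individuals wins.
            a, b = rng.choice(pop), rng.choice(pop)
            return a if one_max(a) >= one_max(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)            # single-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=one_max)

if __name__ == "__main__":
    best = evolve()
    print(one_max(best), "of 20 bits set in the best individual")
```

    The one-max problem is a standard teaching example: selection pressure alone drives the population toward the all-ones string, which makes the mechanics of the loop easy to inspect.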

  1. Optimization-based interactive segmentation interface for multiregion problems

    PubMed Central

    Baxter, John S. H.; Rajchl, Martin; Peters, Terry M.; Chen, Elvis C. S.

    2016-01-01

    Interactive segmentation is becoming of increasing interest to the medical imaging community in that it combines the positive aspects of both manual and automated segmentation. However, general-purpose tools have been lacking in terms of segmenting multiple regions simultaneously with a high degree of coupling between groups of labels. Hierarchical max-flow segmentation has taken advantage of this coupling for individual applications, but until recently, these algorithms were constrained to a particular hierarchy and could not be considered general-purpose. In a generalized form, the hierarchy for any given segmentation problem is specified in run-time, allowing different hierarchies to be quickly explored. We present an interactive segmentation interface, which uses generalized hierarchical max-flow for optimization-based multiregion segmentation guided by user-defined seeds. Applications in cardiac and neonatal brain segmentation are given as example applications of its generality. PMID:27335892

  2. Screamer version 4.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, Rick; Struve, Kenneth W.; Kiefer, Mark L.

    2017-02-16

    Screamer is a special purpose circuit code developed for the design of Pulsed Power systems. It models electrical circuits which have a restricted topology in order to provide a fast-running tool while still allowing configurations general enough for most Pulsed Power system designs.

  3. Discourse analysis in general practice: a sociolinguistic approach.

    PubMed

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here.

  4. Workshop on dimensional analysis for design, development, and research executives

    NASA Technical Reports Server (NTRS)

    Goodman, R. A.; Abernathy, W. J.

    1971-01-01

    The proceedings of a conference of research and development executives are presented. The purpose of the meeting was to develop an understanding of the conditions which are appropriate for the use of certain general management tools and those conditions which render these tools inappropriate. The verbatim statements of the participants are included to show the direction taken initially by the conference. Formal presentations of management techniques for research and development are developed.

  5. Netbook User’s Guide and Installation Manual.

    DTIC Science & Technology

    1997-01-31

    The general purpose of Netbook is to add value to the information available online, by developing a collaborative environment within which that...information can be effectively accessed, stored, annotated, and structured. Netbook is a prototype tool that provides users with the capacity for

  6. Trends in Programming Languages for Neuroscience Simulations

    PubMed Central

    Davison, Andrew P.; Hines, Michael L.; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease of use of the simulator interface are critical to efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing. PMID:20198154
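
    The concise model scripting that the review credits to dynamic general-purpose languages can be illustrated with a toy example. The sketch below is a hypothetical leaky integrate-and-fire neuron in plain Python (it is not code from any of the simulators reviewed; all names and parameter values are illustrative):

```python
# Minimal leaky integrate-and-fire neuron (illustrative sketch only).
# Euler-integrates dV/dt = (-(V - v_rest) + i_ext) / tau and records a
# spike time whenever V crosses v_thresh, then resets V.
def simulate_lif(i_ext=1.5, t_max=100.0, dt=0.1,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

spike_times = simulate_lif()  # regular spiking, since i_ext holds V above threshold
```

    A domain-specific language would need dedicated syntax for even this much; in a general-purpose language the model, its parameters, and any downstream analysis of `spike_times` share one interactive environment.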

  7. Trends in programming languages for neuroscience simulations.

    PubMed

    Davison, Andrew P; Hines, Michael L; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease of use of the simulator interface are critical to efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing.

  8. Development of a Real-Time General-Purpose Digital Signal Processing Laboratory System.

    DTIC Science & Technology

    1983-12-01

    should serve several important purposes: to familiarize students with the use of common DSP tools in an instructional environment, to serve as a research ...of Dayton Research Institute researchers for DSP software and DSP system design insight. 3. Formulation of statement of requirements for development...Neither the University of Dayton nor its Research Institute have a DSP computer system. While UD offered no software or DSP system design information

  9. Instructor guide : managing operating cost for rural and small urban transit systems.

    DOT National Transportation Integrated Search

    2013-01-01

    The purpose of the workshop is to provide rural and small urban transit managers and staff with tools to analyze, track, predict, and manage operational costs. The workshop will have a beginning and ending general session, and will provide six sessio...

  10. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  11. Functional specifications for AI software tools for electric power applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, W.S.

    1985-08-01

    The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills that AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, data base inquiry, simulation and machine-machine interface.

  12. Floating-Point Modules Targeted for Use with RC Compilation Tools

    NASA Technical Reports Server (NTRS)

    Sahin, Ibrahin; Gloster, Clay S.

    2000-01-01

    Reconfigurable Computing (RC) has emerged as a viable computing solution for computationally intensive applications. Several applications have been mapped to RC systems, and in most cases they provided the smallest published execution time. Although RC systems offer significant performance advantages over general-purpose processors, they require more application development time. This increased development time provides the motivation to develop an optimized module library, with an assembly language instruction format interface, for use with future RC systems, which will reduce development time significantly. In this paper, we present area/performance metrics for several different types of floating-point (FP) modules that can be utilized to develop complex FP applications. These modules are highly pipelined and optimized for both speed and area. Using these modules, an example application, FP matrix multiplication, is also presented. Our results and experience show that, with these modules, an 8-10X speedup over general-purpose processors can be achieved.

  13. Multi-modal virtual environment research at Armstrong Laboratory

    NASA Technical Reports Server (NTRS)

    Eggleston, Robert G.

    1995-01-01

    One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.

  14. A fast ultrasonic simulation tool based on massively parallel implementations

    NASA Astrophysics Data System (ADS)

    Lambert, Jason; Rougeron, Gilles; Lacassagne, Lionel; Chatillon, Sylvain

    2014-02-01

    This paper presents a CIVA optimized ultrasonic inspection simulation tool, which takes advantage of the power of massively parallel architectures: graphical processing units (GPU) and multi-core general-purpose processors (GPP). This tool is based on the classical approach used in CIVA: the interaction model is based on Kirchhoff, and the ultrasonic field around the defect is computed by the pencil method. The model has been adapted and parallelized for both architectures. At this stage, the configurations addressed by the tool are: mono- and multi-element probes, planar specimens made of simple isotropic materials, and planar rectangular defects or side-drilled holes of small diameter. Validations of the model accuracy and performance measurements are presented.

  15. Assessment of General Chemistry Instruction

    ERIC Educational Resources Information Center

    Bergin, Adam; Sharp, Kevan; Gatlin, Todd A.; Villalta-Cerdas, Adrian; Gower, Austin; Sandi-Urena, Santiago

    2013-01-01

    Commercial online instructor evaluations have gained traction in influencing students' decisions on professor and course selections at universities. RateMyProfessors.com (RMP) is the most popular of such evaluation tools and houses a wealth of information from the students' viewpoint. The purpose of this study was to determine whether RMP data…

  16. 29 CFR 1915.131 - General precautions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., shall be adequately guarded. (g) Headers, manifolds and widely spaced hose connections on compressed air.... Grouped air connections may be marked in one location. (h) Before use, compressed air hose shall be... electric cords for this purpose is prohibited. (b) When air tools of the reciprocating type are not in use...

  17. Concept Map as an Assessment Tool in Secondary School Mathematics: An Analysis of Teachers' Perspectives

    ERIC Educational Resources Information Center

    Mutodi, Paul; Chigonga, Benard

    2016-01-01

    This paper reports on teachers' views on concept mapping: its applicability, reliability, advantages, and difficulties. A close-ended questionnaire was administered to 50 purposefully selected secondary school mathematics teachers from Sekhukhune District, Limpopo, South Africa. The findings indicate that mathematics teachers generally perceive…

  18. Florida City & County Government. A Condensed Reference Version.

    ERIC Educational Resources Information Center

    Massialas, Byron; Jenkins, Ann

    Designed to serve as a reference tool on city and county government in Florida, this handbook consists of lessons that can be used by schools, community groups, newly elected officials, and libraries. These curriculum materials on Florida city and county governments specifically address the general purpose of local governments. Subject areas…

  19. Historical Development of Simulation Models of Recreation Use

    Treesearch

    Jan W. van Wagtendonk; David N. Cole

    2005-01-01

    The potential utility of modeling as a park and wilderness management tool has been recognized for decades. Romesburg (1974) explored how mathematical decision modeling could be used to improve decisions about regulation of wilderness use. Cesario (1975) described a computer simulation modeling approach that utilized GPSS (General Purpose Systems Simulator), a...

  20. Check Out Your Shop Planning.

    ERIC Educational Resources Information Center

    Brant, Herbert M.

    1967-01-01

    A comprehensive checklist is presented for assistance in planning and remodeling all types of industrial arts facilities. Items to be rated are in the form of suggestions or specifications related to facility function. Categories developed include--(1) purpose, (2) general laboratory arrangement, (3) hand tools and storage, (4) room safety, (5)…

  1. Developments and Changes Resulting from Writing and Thinking Assessment

    ERIC Educational Resources Information Center

    Flateby, Teresa

    2009-01-01

    This article chronicles the evolution of a large research extensive institution's General Education writing assessment efforts from an initial summative focus to a formative, improvement focus. The methods of assessment, which changed as the assessment purpose evolved, are described. As more data were collected, the measurement tool was…

  2. COMPASS: A general purpose computer aided scheduling tool

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Fox, Barry; Culbert, Chris

    1991-01-01

    COMPASS is a generic scheduling system developed by McDonnell Douglas under the direction of the Software Technology Branch at JSC. COMPASS is intended to illustrate the latest advances in scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to potential NASA Space Station Freedom standards. COMPASS has some unique characteristics that distinguish it from commercial products. These characteristics are discussed and used to illustrate some differences between scheduling tools.

  3. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    DOE PAGES

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...

    2015-11-09

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. In this paper, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.

  4. ATHLETE: Lunar Cargo Unloading from a High Deck

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2010-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through, or at least out of, extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are at least comparable to, or larger than, the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be lighter than a conventional all-terrain mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom to be used as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A power take-off from the wheel actuates the tools, so that they can take advantage of the 1+ horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  5. ATHLETE: a Cargo and Habitat Transporter for the Moon

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2009-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. The vehicle concept is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through (or at least out of) extreme terrain, the wheels and wheel actuators can be sized only for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable to, or larger than, the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25 percent lighter than a conventional mobility chassis for planetary exploration. A side benefit of this approach is that each limb has sufficient degrees of freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A rotating power take-off from the wheel actuates the tools, so that they can take advantage of the 1-plus-horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  6. SCNS: a graphical tool for reconstructing executable regulatory networks from single-cell genomic data.

    PubMed

    Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin

    2018-05-25

    Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.
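
    The rule search that such a tool performs can be sketched in miniature. The hypothetical Python fragment below (an illustration of the general technique, not SCNS itself; the toy data and names are ours) brute-forces a Boolean update rule for one gene from observed state transitions:

```python
from itertools import product

# Toy time-course data: (state of genes A and B) -> next observed value of A.
transitions = [
    ((0, 0), 0),
    ((0, 1), 1),
    ((1, 0), 1),
    ((1, 1), 1),
]

def find_rules(transitions):
    """Enumerate all 16 two-input Boolean functions as truth tables and
    keep those consistent with every observed transition."""
    rules = []
    for table in product([0, 1], repeat=4):
        def rule(a, b, t=table):
            return t[2 * a + b]          # index the truth table by the inputs
        if all(rule(*state) == nxt for state, nxt in transitions):
            rules.append(table)
    return rules

consistent = find_rules(transitions)     # here only OR, truth table (0, 1, 1, 1)
```

    A real tool must search far larger rule spaces far more efficiently than by enumeration, but the executable character of the result, a rule that can be run forward to predict the effect of perturbations, is the same.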

  7. ATHLETE: A Limbed Vehicle for Solar System Exploration

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2012-01-01

    As part of the Human-Robot Systems project funded by NASA, the Jet Propulsion Laboratory has developed a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through, or at least out of, extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable to, or larger than, the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25% lighter than a conventional mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb.

  8. EURRECA: development of tools to improve the alignment of micronutrient recommendations.

    PubMed

    Matthys, C; Bucchini, L; Busstra, M C; Cavelaars, A E J M; Eleftheriou, P; Garcia-Alvarez, A; Fairweather-Tait, S; Gurinović, M; van Ommen, B; Contor, L

    2010-11-01

    Approaches through which reference values for micronutrients are derived, as well as the reference values themselves, vary considerably across countries. Harmonisation is needed to improve nutrition policy and public health strategies. The EURRECA (EURopean micronutrient RECommendations Aligned, http://www.eurreca.org) Network of Excellence is developing generic tools for systematically establishing and updating micronutrient reference values or recommendations. Different types of instruments (including best practice guidelines, interlinked web pages, online databases and decision trees) have been identified. The first set of instruments is for training purposes and includes mainly interactive digital learning materials. The second set of instruments comprises collection and interlinkage of diverse information sources that have widely varying contents and purposes. In general, these sources are collections of existing information. The purpose of the majority of these information sources is to provide guidance on best practice for use in a wider scientific community or for users and stakeholders of reference values. The third set of instruments includes decision trees and frameworks. The purpose of these tools is to guide non-scientists in decision making based on scientific evidence. This platform of instruments will, in particular in Central and Eastern European countries, contribute to future capacity-building development in nutrition. The use of these tools by the scientific community, the European Food Safety Authority, bodies responsible for setting national nutrient requirements and others should ultimately help to align nutrient-based recommendations across Europe. Therefore, EURRECA can contribute towards nutrition policy development and public health strategies.

  9. THE New ROSIE (Rule Oriented System for Implementing Expertise) (Trade Name) Reference Manual and User’s Guide

    DTIC Science & Technology

    1987-06-01

    advanced I/O operations * extended variations of the data types and control structures found in most symbolic languages. Features such as rulesets and...paradigms. Because of its "general-purpose" flavor, it is less structured and more flexible than many contemporary AI systems and tools. Nonetheless...operations. Such operations were hardwired into the language because they did not fit easily into any general linguistic structure. Some operations

  10. Task Analysis Strategies and Practices. Practice Application Brief.

    ERIC Educational Resources Information Center

    Brown, Bettina Lankard

    Worker-oriented, job-oriented, and cognitive task analyses have all been used as tools for closing the gap between what curriculum teaches and what workers do. Although they share a commonality of purpose, the focus, cost, and practicality of task analysis techniques vary. Worker-oriented task analysis focuses on general human behaviors required…

  11. An Educational Tool for Outdoor Education and Environmental Concern

    ERIC Educational Resources Information Center

    Sandell, Klas; Ohman, Johan

    2013-01-01

    The purpose of this paper is to suggest an outdoor education model that respects the need to critically discuss the general belief in a causal relationship between experiences of nature, environmentally-friendly attitudes and behavioural change, but that at the same time respects the legitimate claims on the part of outdoor education practice for…

  12. Generalization Following Tablet-Based Instruction in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Chebli, Sabine Saade; Lanovaz, Marc J.; Dufour, Marie-Michèle

    2017-01-01

    Given that children with autism spectrum disorders (ASDs) often require one-to-one individualized instruction, using tablets as teaching tools may represent an interesting option in classrooms with high student to teacher ratios. The purpose of our study was to extend research by evaluating the effects of tablet-based instruction on the…

  13. Entertainment-Education and the Ethics of Social Intervention.

    ERIC Educational Resources Information Center

    Cambridge, Vibert; And Others

    More specifically than the general concept of "development," the use of entertainment media as a tool for social intervention implies the purposive utilization of the mass media to engineer specific changes in knowledge, attitudes, or practice. Thus, this type of use of the entertainment media is inseparable from the notion of "what…

  14. Native Networking: Telecommunications and Information Technology in Indian Country.

    ERIC Educational Resources Information Center

    Casey, James; Ross, Randy; Warren, Marcia

    This report on the status of telecommunications and information technology in Indian Country was created as a tool for reference, training, planning, and general educational purposes to be used by Native Americans, government policy makers, and others. A background section discusses policy and the current state of Native communities with regard to…

  15. How to Compute the Partial Fraction Decomposition without Really Trying

    ERIC Educational Resources Information Center

    Brazier, Richard; Boman, Eugene

    2007-01-01

    For various reasons there has been a recent trend in college and high school calculus courses to de-emphasize teaching the Partial Fraction Decomposition (PFD) as an integration technique. This is regrettable because the Partial Fraction Decomposition is considerably more than an integration technique. It is, in fact, a general purpose tool which…
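
    As a concrete instance of the technique the abstract refers to, the following sketch computes a partial fraction decomposition by the cover-up method for a proper rational function with distinct linear factors (the function name and the worked example are ours, not the article's):

```python
from fractions import Fraction

def pfd_residues(num_coeffs, roots):
    """Cover-up method: for p(x) / prod_i (x - r_i) with distinct roots r_i,
    return residues A_i such that the function equals sum_i A_i / (x - r_i)."""
    def p(x):
        acc = Fraction(0)
        for c in num_coeffs:             # coefficients, highest degree first
            acc = acc * x + c            # Horner evaluation
        return acc
    residues = []
    for r in roots:
        denom = Fraction(1)
        for s in roots:
            if s != r:
                denom *= r - s           # "cover up" the (x - r) factor
        residues.append(p(Fraction(r)) / denom)
    return residues

# (3x + 5) / ((x + 1)(x + 2)) = 2/(x + 1) + 1/(x + 2)
residues = pfd_residues([3, 5], [-1, -2])   # -> [Fraction(2, 1), Fraction(1, 1)]
```

    The same residues serve integration, inverse Laplace transforms, and series expansion alike, which is the sense in which PFD is a general-purpose tool rather than only an integration technique.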

  16. MYRaf: An Easy Aperture Photometry GUI for IRAF

    NASA Astrophysics Data System (ADS)

    Niaei, M. S.; Kılıç, Y.; Özeren, F. F.

    2015-07-01

    We describe the design and development of MYRaf, a GUI (Graphical User Interface) that aims to be completely open source under the General Public License (GPL). MYRaf is an easy-to-use, reliable, and fast IRAF aperture photometry GUI tool for those who are conversant with text-based software and command-line procedures in GNU/Linux OSs. MYRaf uses IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor with the general-purpose, high-level programming language Python, and uses the Qt framework.

  17. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general-purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial-grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
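
    The everyday-design-tool usage argued for in the abstract can be sketched with a minimal general-purpose optimizer that needs nothing from the designer but objective-function evaluations (an illustration of the idea only, not the author's software; the objective is a made-up example):

```python
def minimize(f, x0, step=0.1, h=1e-6, iters=500):
    """Gradient descent using finite-difference gradients, so the caller
    supplies only a black-box objective f mapping a design point to a cost."""
    x = list(x0)
    for _ in range(iters):
        fx = f(x)
        grad = []
        for i in range(len(x)):
            xp = x[:]
            xp[i] += h
            grad.append((f(xp) - fx) / h)   # forward-difference derivative
        x = [xi - step * gi for xi, gi in zip(x, grad)]
    return x

# Hypothetical design objective: squared distance from the target point (3, -1).
best = minimize(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

    Production codes wrap this interface with the convergence-rate and reliability machinery the author discusses, but the contract, a design vector in and a cost out, is what makes such tools general purpose.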

  18. Using the 5Ps Leadership Analysis to Examine the Battle of Antietam: An Explanation and Case Study

    ERIC Educational Resources Information Center

    Hull, Bradley Z.; Allen, Scott J.

    2012-01-01

    The authors describe an exploratory analytical tool called "The 5Ps Leadership Analysis" (Personal Attributes, Position, Purpose, Practices/Processes, and Product) as a heuristic for better understanding the complexities of leadership. Using "The 5Ps Leadership Analysis," the authors explore the leadership of General Robert E.…

  19. TAKING THE LONG VIEW TOWARDS THE LONG WAR. Equipping General Purpose Force Leaders with Soft Power Tools for Irregular Warfare

    DTIC Science & Technology

    2009-02-12

    equivalent to usual printing or typescript. Can read either representations of familiar formulaic verbal exchanges or simple language containing only...read simple, authentic written material in a form equivalent to usual printing or typescript on subjects within a familiar context. Able to read with

  20. ToonTalk(TM)--An Animated Programming Environment for Children.

    ERIC Educational Resources Information Center

    Kahn, Ken

    This paper describes ToonTalk, a general-purpose concurrent programming system in which the source code is animated and the programming environment is a video game. The design objectives of ToonTalk were to create a self-teaching programming system for children that was also a very powerful and flexible programming tool. A keyboard can be used for…

  1. Chapter 13 - Perspectives on LANDFIRE Prototype Project Accuracy Assessment

    Treesearch

    James Vogelmann; Zhiliang Zhu; Jay Kost; Brian Tolk; Donald Ohlen

    2006-01-01

    The purpose of this chapter is to provide a general overview of the many aspects of accuracy assessment pertinent to the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project). The LANDFIRE Prototype formed a large and complex research and development project with many broad-scale data sets and products developed throughout...

  2. Tools and technologies for expert systems: A human factors perspective

    NASA Technical Reports Server (NTRS)

    Rajaram, Navaratna S.

    1987-01-01

    It is widely recognized that technologies based on artificial intelligence (AI), especially expert systems, can make significant contributions to the productivity and effectiveness of operations of information- and knowledge-intensive organizations such as NASA. At the same time, these being relatively new technologies, there is the problem of transferring the technology to key personnel of such organizations. The problems of examining the potential of expert systems and of technology transfer are addressed in the context of human factors applications. One of the topics of interest was the investigation of the potential use of expert system building tools, particularly NEXPERT, as a technology transfer medium. Two basic conclusions were reached in this regard. First, NEXPERT is an excellent tool for rapid prototyping of experimental expert systems, but not ideal as a delivery vehicle. Therefore, it is not a substitute for general-purpose system implementation languages such as LISP or C. This assertion probably holds for nearly all such tools on the market today. Second, an effective technology transfer mechanism is to formulate and implement expert systems for problems which members of the organization in question can relate to. For this purpose, the LIghting EnGineering Expert (LIEGE) was implemented, using NEXPERT as the tool for technology transfer and to illustrate the value of expert systems to the activities of the Man-System Division.

  3. Micro electrical discharge milling using deionized water as a dielectric fluid

    NASA Astrophysics Data System (ADS)

    Chung, Do Kwan; Kim, Bo Hyun; Chu, Chong Nam

    2007-05-01

    In electrical discharge machining, the dielectric fluid is an important factor affecting machining characteristics. Generally, kerosene and deionized water have been used as dielectric fluids. In micro electrical discharge milling, which uses a micro electrode as a tool, wear of the tool electrode decreases the machining accuracy. However, the use of deionized water instead of kerosene can reduce tool wear and increase the machining speed. This paper investigates micro electrical discharge milling using deionized water. Deionized water with high resistivity was used to minimize the machining gap. Machining characteristics such as tool wear, machining gap and machining rate were investigated according to the resistivity of the deionized water. As the resistivity decreased, tool wear was reduced, but the machining gap increased due to electrochemical dissolution. Micro hemispheres were machined to compare machining efficiency between the two dielectric fluids, kerosene and deionized water.

  4. 2005 TACOM APBI - Partnering to Reset, Recapitalize and Restructure the Force

    DTIC Science & Technology

    2005-10-28

    training. 28 Oct 05~APBI ~9~ Force Projection ~ Technology Challenges (cont.) Force Sustainment Systems Develop smart airdrop systems using Global... UART ). General Purpose Electronic Test Equipment (GPETE) Transform multiple conventional GPETE instruments into a single Virtual Instrument with a...Consists of tools and equipment to refill and repair carbon dioxide fire extinguishers. Rapid Runway Repair - Components include sand grid sections

  5. 75 FR 76973 - Florida Gas Transmission Company, LLC; Notice of Intent to Prepare an Environmental Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ... loop \\1\\ in Miami-Dade County, Florida. The project would also include the installation of a pig... loop allows more gas to be moved through the system. \\2\\ A ``pig'' is a tool that is inserted into and... purposes. A pig launcher is an aboveground facility where pigs are inserted into the pipeline. The general...

  6. Refrigeration Playbook. Heat Reclaim; Optimizing Heat Rejection and Refrigeration Heat Reclaim for Supermarket Energy Conservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reis, Chuck; Nelson, Eric; Armer, James

    The purpose of this playbook and accompanying spreadsheets is to generalize the detailed CBP analysis and to put tools in the hands of experienced refrigeration designers to evaluate multiple applications of refrigeration waste heat reclaim across the United States. Supermarkets with large portfolios of similar buildings can use these tools to assess the impact of large-scale implementation of heat reclaim systems. In addition, the playbook provides best practices for implementing heat reclaim systems to achieve the best long-term performance possible. It includes guidance on operations and maintenance as well as measurement and verification.

  7. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
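    The stepping-and-bracketing scheme above reduces event detection to scalar root finding. The sketch below is a generic illustration of that idea (not the authors' implementation): fixed-interval stepping detects sign changes of the event function, and bisection refines each bracketed root. The event function, step size, and tolerance are arbitrary demonstration choices.

    ```python
    import numpy as np

    def find_events(f, t0, t1, step, tol=1e-10):
        """Locate roots of a scalar event function f(t) on [t0, t1].

        Stepping phase: sample f at fixed intervals to detect sign changes.
        Bracketing phase: bisection refines each bracketed root to within tol.
        """
        roots = []
        ts = np.arange(t0, t1, step)
        for a, b in zip(ts[:-1], ts[1:]):
            fa, fb = f(a), f(b)
            if fa == 0.0:
                roots.append(a)          # sample landed exactly on a root
            elif fa * fb < 0:            # sign change brackets a root
                lo, hi = a, b
                while hi - lo > tol:
                    mid = 0.5 * (lo + hi)
                    if f(lo) * f(mid) <= 0:
                        hi = mid
                    else:
                        lo = mid
                roots.append(0.5 * (lo + hi))
        return roots

    # e.g. "events" of sin(t) on [0, 10]: t = 0, pi, 2*pi, 3*pi
    roots = find_events(np.sin, 0.0, 10.0, 0.5)
    ```

    As with any stepping detector, an event shorter than one step can be missed entirely, which is why the choice of step size matters in such algorithms.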

  9. Shuttle user analysis (study 2.2). Volume 3: Business risk and value of operations in space (BRAVO). Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.

  10. Validation of an instrument to measure inter-organisational linkages in general practice.

    PubMed

    Amoroso, Cheryl; Proudfoot, Judith; Bubner, Tanya; Jayasinghe, Upali W; Holton, Christine; Winstanley, Julie; Beilby, Justin; Harris, Mark F

    2007-12-03

    Linkages between general medical practices and external services are important for high quality chronic disease care. The purpose of this research is to describe the development, evaluation and use of a brief tool that measures the comprehensiveness and quality of a general practice's linkages with external providers for the management of patients with chronic disease. In this study, clinical linkages are defined as the communication, support, and referral arrangements between services for the care and assistance of patients with chronic disease. An interview to measure surgery-level (rather than individual clinician-level) clinical linkages was developed, piloted, reviewed, and evaluated with 97 Australian general practices. Two validated survey instruments were posted to patients, and a survey of locally available services was developed and posted to participating Divisions of General Practice (support organisations). Hypotheses regarding internal validity, association with local services, and patient satisfaction were tested using factor analysis, logistic regression and multilevel regression models. The resulting General Practice Clinical Linkages Interview (GP-CLI) is a nine-item tool with three underlying factors: referral and advice linkages, shared care and care planning linkages, and community access and awareness linkages. Local availability of chronic disease services has no effect on the comprehensiveness of the services with which practices link; however, the comprehensiveness of clinical linkages is associated with patient assessment of access, receptionist services, and continuity of care in their general practice. The GP-CLI may be useful to researchers examining comparable health care systems for measuring the comprehensiveness and quality of linkages at a general practice-level with related services, possessing both internal and external validity. 
The tool can be used with large samples exploring the impact, outcomes, and facilitators of high quality clinical linkages in general practice.

  11. Movement analysis of upper limb during resistance training using general purpose robot arm "PA10"

    NASA Astrophysics Data System (ADS)

    Morita, Yoshifumi; Yamamoto, Takashi; Suzuki, Takahiro; Hirose, Akinori; Ukai, Hiroyuki; Matsui, Nobuyuki

    2005-12-01

    In this paper we perform movement analysis of an upper limb during resistance training. We selected sanding training, one type of upper limb resistance training widely performed in occupational therapy. Our long-term aims are to quantitatively evaluate the therapeutic effect on upper limb motor function during training and to develop a new rehabilitation training support system. For these purposes, we first perform movement analysis using a conventional training tool: by measuring upper limb motion during sanding training we extract its characteristic features. Next we perform movement analysis using a simulated sanding training system constructed around the general purpose robot arm "PA10". This system enables us to measure the force/torque exerted by subjects and to easily change the resistance load. The control algorithm is based on impedance control. These analyses identified the characteristic features of upper limb motion during sanding training.

  12. Investigation with respect to content and general properties of physics 10 textbook in accordance with the 2013 secondary school physics curriculum

    NASA Astrophysics Data System (ADS)

    Kavcar, Nevzat; Korkmaz, Cihan

    2017-02-01

    The purpose of this work is to determine physics teacher candidates' views on the content and general properties of the Physics 10 textbook prepared according to the 2013 Secondary School Physics Curriculum. Twenty-three teacher candidates in the 2014-2015 school year constituted the sample of the study, which used a survey model based on qualitative research techniques and document analysis. The data collection tools were forms containing 51 open-ended questions on the subject content of the textbook and nine on its general properties. It was concluded that the textbook was sufficient with respect to its life-context-based, activity-based and student-centered approach, its language, and its development of social and inquiry skills, but insufficient in addressing the learning outcomes of the Curriculum and in providing activities, projects and homework on applications. Activities and applications addressing the affective domain, and such assessment and evaluation tools as concept maps, concept networks and semantic analysis tables, could be added to the textbook.

  13. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  14. An evaluation of general practice websites in the UK.

    PubMed

    Howitt, Alistair; Clement, Sarah; de Lusignan, Simon; Thiru, Krish; Goodwin, Daryl; Wells, Sally

    2002-10-01

    General practice websites are an emerging phenomenon, but there have been few critical evaluations of their content. Previously developed rating instruments to assess medical websites have been criticized for failing to report their reliability and validity. The purpose of this study was to develop a rating instrument for assessing UK general practice websites, and then to evaluate them critically. The STaRNet Website Assessment Tool (SWAT) was developed listing criteria that general practice websites may meet, which was then used to evaluate a random sample of websites drawn from an electronic database. A second assessor rated a subsample of the sites to assess the tool's inter-rater reliability. The setting was an information technology group of a general practice research network using a random sample of 108 websites identified from the database. The main outcome measures were identification of rating criteria and frequency counts from the website rating instrument. Ninety (93.3%) sites were accessible, of which 84 were UK general practice websites. Criteria most frequently met were those describing the scope of the website and their functionality. Apart from e-mail to practices, criteria related to electronic communication were rarely met. Criteria relating to the quality of information were least often met. Inter-rater reliability kappa values for the items in the tool ranged from -0.06 to 1.0 (mean 0.59). Values were >0.6 for 15 out of 25 criteria assessed in 40 sites which were rated by two assessors. General practice websites offer a wide range of information. They are technically satisfactory, but do not exploit fully the potential for electronic doctor-patient communication. The quality of information they provide is poor. The instrument may be developed as a template for general practices producing or revising their own websites.
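    The inter-rater reliability values reported above are kappa statistics. As a point of reference (this is not the study's code), unweighted Cohen's kappa for two assessors' categorical ratings is observed agreement corrected for chance agreement:

    ```python
    def cohens_kappa(rater1, rater2):
        """Unweighted Cohen's kappa for two parallel lists of categorical ratings.

        kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
        p_e is the agreement expected by chance from each rater's marginals.
        (Undefined when p_e == 1, i.e. both raters always use one category.)
        """
        n = len(rater1)
        observed = sum(a == b for a, b in zip(rater1, rater2)) / n
        categories = set(rater1) | set(rater2)
        expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                       for c in categories)
        return (observed - expected) / (1 - expected)
    ```

    Perfect agreement yields 1.0, while agreement no better than chance yields 0, matching the conventional interpretation of the values in the range reported above.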

  15. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  16. Sparse filtering with the generalized lp/lq norm and its applications to the condition monitoring of rotating machinery

    NASA Astrophysics Data System (ADS)

    Jia, Xiaodong; Zhao, Ming; Di, Yuan; Li, Pin; Lee, Jay

    2018-03-01

    Sparsity has recently become an increasingly important topic in machine learning and signal processing. One large family of sparsity measures in the current literature is the generalized lp/lq norm, which is scale invariant and is widely regarded as a normalized lp norm. However, the characteristics of the generalized lp/lq norm remain little discussed, and its application to the condition monitoring of rotating devices has been unexplored. In this study, we first discuss the characteristics of the generalized lp/lq norm for sparse optimization and then propose a sparse filtering method based on the generalized lp/lq norm for impulsive signature enhancement. Driven by the trend of industrial big data and the need to reduce maintenance costs for industrial equipment, the proposed sparse filter is customized for vibration signal processing and implemented on a bearing and a gearbox for condition monitoring. Based on the results of the industrial implementations in this paper, the proposed method proves a promising tool for impulsive feature enhancement, and its superiority over previous methods is also demonstrated.
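    A minimal sketch of the sparsity measure itself, assuming the common convention ||x||_p / ||x||_q with p < q (here p=1, q=2; the paper's exact parameter choices may differ):

    ```python
    import numpy as np

    def lp_lq_norm(x, p=1.0, q=2.0):
        """Generalized lp/lq norm ||x||_p / ||x||_q, a scale-invariant sparsity
        measure. With p < q, sparser (more impulsive) signals give smaller values."""
        x = np.abs(np.asarray(x, dtype=float))
        return np.sum(x ** p) ** (1.0 / p) / np.sum(x ** q) ** (1.0 / q)

    impulsive = np.zeros(100)
    impulsive[7] = 5.0          # a single spike: maximally sparse
    flat = np.ones(100)         # energy spread evenly: not sparse at all
    ```

    Scale invariance is what makes the measure "normalized": multiplying a signal by any constant leaves the value unchanged, so the measure responds to the shape of the signal, not its amplitude.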

  17. Measuring general surgery residents' communication skills from the patient's perspective using the Communication Assessment Tool (CAT).

    PubMed

    Stausmire, Julie M; Cashen, Constance P; Myerholtz, Linda; Buderer, Nancy

    2015-01-01

    The Communication Assessment Tool (CAT) has been used and validated to assess Family and Emergency Medicine resident communication skills from the patient's perspective. However, it has not been previously reported as an outcome measure for general surgery residents. The purpose of this study is to establish initial benchmarking data for the use of the CAT as an evaluation tool in an osteopathic general surgery residency program. Results are analyzed quarterly and used by the program director to provide meaningful feedback and targeted goal setting for residents to demonstrate progressive achievement of interpersonal and communication skills with patients. The 14-item paper version of the CAT (developed by Makoul et al. for residency programs) asks patients to anonymously rate surgery residents on discrete communication skills using a 5-point rating scale immediately after the clinical encounter. Results are reported as the percentage of items rated as "excellent" (5) by the patient. The setting is a hospital-affiliated ambulatory urban surgery office staffed by the residency program. Participants are representative of adult patients of both sexes across all ages with diverse ethnic backgrounds. They include preoperative and postoperative patients, as well as those needing diagnostic testing and follow-up. Data have been collected on 17 general surgery residents from a single residency program representing 5 postgraduate year levels and 448 patient encounters since March 2012. The reliability (Cronbach α) of the tool for surgery residents was 0.98. The overall mean percentage of items rated as excellent was 70% (standard deviation = 42%), with a median of 100%. The CAT is a useful tool for measuring 1 facet of resident communication skills: the patient's perception of the physician-patient encounter. 
The tool provides a unique and personalized outcome measure for identifying communication strengths and improvement opportunities, allowing residents to receive specific feedback and mentoring by program directors. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  18. Family history tools in primary care: does one size fit all?

    PubMed

    Wilson, B J; Carroll, J C; Allanson, J; Little, J; Etchegary, H; Avard, D; Potter, B K; Castle, D; Grimshaw, J M; Chakraborty, P

    2012-01-01

    Family health history (FHH) has potential value in many health care settings. This review discusses the potential uses of FHH information in primary care and the need for tools to be designed accordingly. We developed a framework in which the attributes of FHH tools are mapped against these different purposes. It contains 7 attributes mapped against 5 purposes. In considering different FHH tool purposes, it is apparent that different attributes become more or less important, and that tools for different purposes require different implementation and evaluation strategies. The context in which a tool is used is also relevant to its effectiveness. For FHH tools, it is unlikely that 'one size fits all', although appreciation of different purposes, users and contexts should facilitate the development of different applications from single FHH platforms. Copyright © 2012 S. Karger AG, Basel.

  19. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
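    The finite-difference validation mentioned above amounts to comparing analytic design sensitivities against numerically differentiated responses. A generic central-difference checker (illustrative only, unrelated to MSC/NASTRAN internals) looks like:

    ```python
    import numpy as np

    def finite_difference_gradient(f, x, h=1e-6):
        """Central-difference gradient of a scalar response f at design point x,
        used as the reference against which analytic sensitivities are validated."""
        g = np.zeros_like(x, dtype=float)
        for i in range(len(x)):
            e = np.zeros_like(x, dtype=float)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        return g

    # toy response with known analytic sensitivity: d/dx (x0**2 + 3*x1) = [2*x0, 3]
    f = lambda x: x[0] ** 2 + 3.0 * x[1]
    g = finite_difference_gradient(f, np.array([2.0, 5.0]))
    ```

    In practice each finite-difference entry costs two full response evaluations per design variable, which is exactly why an analytic sensitivity capability inside the finite element system is valuable once validated.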

  20. Computational Prediction of Protein-Protein Interactions

    PubMed Central

    Ehrenberger, Tobias; Cantley, Lewis C.; Yaffe, Michael B.

    2015-01-01

    The prediction of protein-protein interactions and kinase-specific phosphorylation sites on individual proteins is critical for correctly placing proteins within signaling pathways and networks. The importance of this type of annotation continues to increase with the continued explosion of genomic and proteomic data, particularly with emerging data categorizing posttranslational modifications on a large scale. A variety of computational tools are available for this purpose. In this chapter, we review the general methodologies for these types of computational predictions and present a detailed user-focused tutorial of one such method and computational tool, Scansite, which is freely available to the entire scientific community over the Internet. PMID:25859943

  1. DspaceOgre 3D Graphics Visualization Tool

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Myin, Steven; Pomerantz, Marc I.

    2011-01-01

    This general-purpose 3D graphics visualization C++ tool is designed for visualization of simulation and analysis data for articulated mechanisms. Examples of such systems are vehicles, robotic arms, biomechanics models, and biomolecular structures. DspaceOgre builds upon the open-source Ogre3D graphics visualization library. It provides additional classes to support the management of complex scenes involving multiple viewpoints and different scene groups, and can be used as a remote graphics server. This software provides improved support for adding programs at the graphics processing unit (GPU) level for improved performance. It also improves upon the messaging interface it exposes for use as a visualization server.

  2. HRLSim: a high performance spiking neural network simulator for GPGPU clusters.

    PubMed

    Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan

    2014-02-01

    Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.

  3. TAQL: A Problem Space Tool for Expert System Development.

    DTIC Science & Technology

    1992-05-01

    underlies Soar (Rosenbloom, Laird, Newell, and McCarl, 1991; Laird, Congdon, Altmann and Swedlow, 1990), a general-purpose intelligent architecture...select among multiple operators (or problem spaces or states), its default action is to use the selection space in a subgoal (Laird, Congdon, Altmann...during the experiments. "* TP: TAQL problems. There are bugs due to misunderstanding TAQL's semantics. Subjects detected only one bug of this class

  4. Validation of catchment models for predicting land-use and climate change impacts. 1. Method

    NASA Astrophysics Data System (ADS)

    Ewen, J.; Parkin, G.

    1996-02-01

    Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new generally applicable method is proposed. This involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).

  5. Topographica: Building and Analyzing Map-Level Simulations from Python, C/C++, MATLAB, NEST, or NEURON Components

    PubMed Central

    Bednar, James A.

    2008-01-01

    Many neural regions are arranged into two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate tools. It has been particularly difficult to bridge across levels of detail, because simulators are typically geared to a specific level, while interfacing between simulators has been a major technical challenge. In this paper, we show that the Python-based Topographica simulator makes it straightforward to build systems that cross levels of analysis, as well as providing a common framework for evaluating and comparing models implemented in other simulators. These results rely on the general-purpose abstractions around which Topographica is designed, along with the Python interfaces becoming available for many simulators. In particular, we present a detailed, general-purpose example of how to wrap an external spiking PyNN/NEST simulation as a Topographica component using only a dozen lines of Python code, making it possible to use any of the extensive input presentation, analysis, and plotting tools of Topographica. Additional examples show how to interface easily with models in other types of simulators. Researchers simulating topographic maps externally should consider using Topographica's analysis tools (such as preference map, receptive field, or tuning curve measurement) to compare results consistently, and for connecting models at different levels. This seamless interoperability will help neuroscientists and computational scientists to work together to understand how neurons in topographic maps organize and operate. PMID:19352443

  6. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
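    The Cases/Workflows organization described above can be sketched as reusable general-purpose modules chained into per-case workflows. Every class and task name below is hypothetical, invented purely to illustrate the pattern, not taken from the BIR framework:

    ```python
    class Module:
        """A reusable, general-purpose processing step."""
        def run(self, data):
            raise NotImplementedError

    class LoadImages(Module):
        def run(self, data):
            data["images"] = ["scan_001", "scan_002"]  # stand-in for real image loading
            return data

    class Segment(Module):
        def run(self, data):
            data["segments"] = [img + "_seg" for img in data["images"]]
            return data

    class Workflow:
        """An ordered set of modules applied to a single patient case."""
        def __init__(self, name, modules):
            self.name, self.modules = name, modules

        def run_case(self, case_id):
            data = {"case": case_id}
            for module in self.modules:
                data = module.run(data)
            return data

    # the same modules can be recombined into other clinical workflows
    volumetry = Workflow("liver-volumetry", [LoadImages(), Segment()])
    result = volumetry.run_case("case-042")
    ```

    Because each module is general purpose, assembling a new clinical workflow is mostly a matter of reordering and reusing existing steps, which is the development-time saving the abstract describes.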

  7. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and to support troubleshooting and triage in case of problems. 
    Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated the dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.

  8. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back-end resources, including public clouds and high-performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using technologies such as ElasticSearch and Grafana, as well as support for helping experiments manage their large-scale production workflows. Managing these workflows in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and to support troubleshooting and triage in case of problems. 
Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.

  9. NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.

    PubMed

    Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S

    2016-01-14

    Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.
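Greedy term-to-concept matching of the kind NOBLE Coder generalises can be sketched as a longest-match dictionary lookup over a token stream. This is a simplification for illustration only, not NOBLE's actual algorithm, and the vocabulary entries and concept IDs below are invented:

```python
def greedy_match(tokens, vocab):
    """Greedily match the longest vocabulary term starting at each token.

    `vocab` maps lowercase term tuples to concept IDs. Returns a list of
    (start, end, concept_id) spans over the token list (end exclusive).
    """
    matches = []
    i = 0
    max_len = max((len(t) for t in vocab), default=0)
    while i < len(tokens):
        # Try the longest possible term first, then shorter ones.
        for n in range(min(max_len, len(tokens) - i), 0, -1):
            span = tuple(w.lower() for w in tokens[i:i + n])
            if span in vocab:
                matches.append((i, i + n, vocab[span]))
                i += n          # greedy: skip past the matched span
                break
        else:
            i += 1              # no term starts here; advance one token
    return matches

vocab = {("heart", "failure"): "C0018801", ("heart",): "C0018787"}
print(greedy_match("acute heart failure patient".split(), vocab))
# → [(1, 3, 'C0018801')]
```

Note how the greedy strategy prefers the two-word term over the nested one-word term; configurable matchers such as NOBLE's offer this and other strategies as options.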

  10. MYRaf: A new Approach with IRAF for Astronomical Photometric Reduction

    NASA Astrophysics Data System (ADS)

    Kilic, Y.; Shameoni Niaei, M.; Özeren, F. F.; Yesilyaprak, C.

    2016-12-01

In this study, the design and some developments of the MYRaf software for astronomical photometric reduction are presented. MYRaf is an easy-to-use, reliable, and fast GUI tool for IRAF aperture photometry. It is an important step toward the automated software pipeline of robotic telescopes, and builds on IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor, written in the general-purpose, high-level programming language Python using the Qt framework.
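The aperture photometry that MYRaf drives through IRAF can be illustrated with a toy implementation: sum the pixels inside a circular aperture and subtract a sky level estimated in a surrounding annulus. This is not MYRaf's code; the function name and parameters are invented for illustration:

```python
import numpy as np

def aperture_photometry(img, x0, y0, r_ap, r_in, r_out):
    """Toy circular-aperture photometry: sum pixels within r_ap of
    (x0, y0) and subtract the median sky estimated in the annulus
    r_in..r_out. A simplified version of what IRAF's phot-style
    tasks do, for illustration only."""
    yy, xx = np.indices(img.shape)          # pixel coordinates (row, col)
    r = np.hypot(xx - x0, yy - y0)          # distance from the source
    sky = np.median(img[(r >= r_in) & (r < r_out)])
    ap = r < r_ap
    return img[ap].sum() - sky * ap.sum()   # background-subtracted flux

img = np.full((41, 41), 10.0)   # flat sky of 10 counts/pixel
img[20, 20] += 100.0            # one bright pixel as a toy "star"
print(aperture_photometry(img, 20, 20, r_ap=3, r_in=6, r_out=10))  # → 100.0
```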

  11. Markets for Cybercrime Tools and Stolen Information: Hackers’ Bazaar

    DTIC Science & Technology

    2014-01-01

Bitcoin. Others include Pecunix, AlertPay, PPcoin, Litecoin, Feathercoin, and Bitcoin extensions, such as Zerocoin. There is no consensus on which form...purpose of targeting wallets and bitcoins. It is difficult to assess trends for different products; product/price relationships can be quite nuanced and...for DDoS attacks against digital currencies (e.g., Bitcoin) DDoS-for-hire services begin Renewed interest in DDoS-for-hire services General Spam

  12. OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.

    PubMed

    Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2013-02-15

    Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.

  13. Communication strategies and volunteer management for the IAU-OAD

    NASA Astrophysics Data System (ADS)

    Sankatsing Nava, Tibisay

    2015-08-01

The IAU Office of Astronomy for Development will be developing a new communication strategy to promote its projects in a way that is relevant to stakeholders and the general public. Ideas include a magazine featuring best practices within the field of astronomy for development and setting up a workflow of communication that integrates the different outputs of the office and effectively uses the information collection tools developed by OAD team members. To accomplish these tasks, the OAD will also develop a community management strategy with existing tools to effectively harness the skills of OAD volunteers for communication purposes. This talk will discuss the new communication strategy of the OAD as well as the expanded community management plans.

  14. Modeling of Geometric Error in Linear Guide Way to Improved the vertical three-axis CNC Milling machine’s accuracy

    NASA Astrophysics Data System (ADS)

    Kwintarini, Widiyanti; Wibowo, Agung; Arthaya, Bagus M.; Yuwana Martawirya, Yatna

    2018-03-01

The purpose of this study was to improve the accuracy of a vertical three-axis CNC milling machine through a general approach of mathematically modeling the machine tool's geometric errors. The inaccuracy of CNC machines can be caused by geometric errors, which arise during the manufacturing process and the assembly phase, and which are an important factor in building machines with high accuracy. The accuracy of the three-axis vertical milling machine is improved by identifying the geometric errors and the error position parameters in the machine tool through mathematical modeling. The geometric error in the machine tool consists of twenty-one error parameters: nine linear error parameters, nine angular error parameters, and three perpendicularity error parameters. The mathematical modeling approach calculates the alignment and angular errors in the components supporting the machine's motion, namely the linear guide way and linear motion elements. The purpose of this approach is the identification of geometric errors that can serve as a reference during the design, assembly, and maintenance stages to improve the accuracy of CNC machines. Mathematically modeling the geometric errors of CNC machine tools can illustrate the relationship between alignment error, position, and angle on a linear guide way of three-axis vertical milling machines.
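Geometric-error models of this kind are commonly built from small-angle homogeneous transformation matrices, one per axis, composed to propagate the error to the tool tip. The sketch below follows that common convention; the composition order and the numeric error values are illustrative, not taken from the study:

```python
import numpy as np

def htm_error(dx, dy, dz, ex, ey, ez):
    """First-order homogeneous transform for one axis: three
    translational errors (dx, dy, dz, in metres) and three small
    angular errors (ex, ey, ez, in radians). A standard small-angle
    approximation used in machine-tool error modelling."""
    return np.array([
        [1.0, -ez,  ey,  dx],
        [ ez, 1.0, -ex,  dy],
        [-ey,  ex, 1.0,  dz],
        [0.0, 0.0, 0.0, 1.0],
    ])

# Compose per-axis error transforms (here two stages) and apply them
# to a nominal tool point 100 mm along Z, in homogeneous coordinates.
E = htm_error(5e-6, 0, 0, 0, 1e-5, 0) @ htm_error(0, 3e-6, 0, 2e-5, 0, 0)
tool_tip = E @ np.array([0.0, 0.0, 0.1, 1.0])
print(tool_tip[:3])   # deviation of a few micrometres in x and y
```

The composed matrix makes explicit how an angular error on one stage turns into a position error at the tool tip proportional to the offset, which is exactly the relationship the abstract describes between alignment error, position, and angle.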

  15. Using a Personal Development Plan for Different Purposes: Its Influence on Undertaking Learning Activities and Job Performance

    ERIC Educational Resources Information Center

    Beausaert, Simon A. J.; Segers, Mien S. R.; Gijselaers, Wim H.

    2011-01-01

Today, organizations are increasingly implementing assessment tools such as Personal Development Plans. Although the true power of the tool lies in supporting the employee's continuing professional development, organizations implement the tool for various purposes: professional development purposes on the one hand and promotion/salary…

  16. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    PubMed

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. 
Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.
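The reliability arm of a validation study is often summarised by chance-corrected agreement between laboratories classifying the same chemicals. A minimal sketch using Cohen's kappa follows; the labels, data, and the choice of kappa are illustrative, not prescribed by any particular validation body:

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two laboratories assigning
    binary classifications (1 = positive, 0 = negative) to the same
    test items -- one common way to summarise between-laboratory
    reliability. Illustrative sketch only."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa1 = sum(a) / n                             # lab A positive rate
    pb1 = sum(b) / n                             # lab B positive rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)       # agreement by chance
    return (po - pe) / (1 - pe)

# Two labs agree on 3 of 4 chemicals:
print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # → 0.5
```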

  17. Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikkel, D. J.; McCabe, J.

This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.

  18. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    In a previous report the design concepts of Charon were presented. Charon is a toolkit that aids engineers in developing scientific programs for structured-grid applications to be run on MIMD parallel computers. It constitutes an augmentation of the general-purpose MPI-based message-passing layer, and provides the user with a hierarchy of tools for rapid prototyping and validation of parallel programs, and subsequent piecemeal performance tuning. Here we describe the implementation of the domain decomposition tools used for creating data distributions across sets of processors. We also present the hierarchy of parallelization tools that allows smooth translation of legacy code (or a serial design) into a parallel program. Along with the actual tool descriptions, we will present the considerations that led to the particular design choices. Many of these are motivated by the requirement that Charon must be useful within the traditional computational environments of Fortran 77 and C. Only the Fortran 77 syntax will be presented in this report.
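The block data distributions such a toolkit creates can be sketched as a contiguous partition of grid points across processors, with the remainder spread over the leading ranks. The function name and return convention below are illustrative, not Charon's actual API:

```python
def block_distribution(npoints, nprocs):
    """Split `npoints` grid points into `nprocs` contiguous blocks,
    giving the first (npoints % nprocs) processors one extra point --
    the usual block data distribution for structured grids. Returns
    half-open (start, end) index ranges, one per processor."""
    base, extra = divmod(npoints, nprocs)
    blocks, start = [], 0
    for p in range(nprocs):
        size = base + (1 if p < extra else 0)
        blocks.append((start, start + size))
        start += size
    return blocks

print(block_distribution(10, 3))   # → [(0, 4), (4, 7), (7, 10)]
```

In an MPI program each rank would use only its own entry of this list to size local arrays and set up ghost-cell exchanges with its neighbours.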

  19. Applications of airborne remote sensing in atmospheric sciences research

    NASA Technical Reports Server (NTRS)

    Serafin, R. J.; Szejwach, G.; Phillips, B. B.

    1984-01-01

    This paper explores the potential for airborne remote sensing for atmospheric sciences research. Passive and active techniques from the microwave to visible bands are discussed. It is concluded that technology has progressed sufficiently in several areas that the time is right to develop and operate new remote sensing instruments for use by the community of atmospheric scientists as general purpose tools. Promising candidates include Doppler radar and lidar, infrared short range radiometry, and microwave radiometry.

  20. Application of Heart Rate Variability in Diagnosis and Prognosis of Individuals with Diabetes Mellitus: Systematic Review.

    PubMed

    França da Silva, Anne Kastelianne; Penachini da Costa de Rezende Barbosa, Marianne; Marques Vanderlei, Franciele; Destro Christofaro, Diego Giuliano; Marques Vanderlei, Luiz Carlos

    2016-05-01

The use of heart rate variability as a tool capable of discriminating individuals with diabetes mellitus is still little explored, as its use has been limited to comparing those with and without the disease. Thus, the purpose of this study was to verify the use of heart rate variability as a tool for diagnostic and prognostic evaluation in persons with diabetes and to identify whether there are cutoff points generated from the use of this tool in these individuals. A search was conducted in the electronic databases MEDLINE, Cochrane Library, Web of Science, EMBASE, and LILACS, from the oldest records until January 2015, by means of descriptors related to the target condition, evaluated tool, and evaluation method. All the studies were evaluated for methodological quality using the QUADAS-2 instrument. Eight studies were selected. In general, the studies showed that heart rate variability is useful to discriminate cardiac autonomic neuropathy in persons with diabetes, and that the sample entropy, SD1/SD2 indices, SDANN, HF, and slope of TFC have better discriminatory power to detect autonomic dysfunction, with sensitivity and specificity values ranging from 72% to 100% and 71% to 97%, respectively. Although there are methodological differences in the indices used, in general this tool demonstrated good sensitivity and specificity and can be used as an additional and/or complementary tool to the conventional autonomic tests, in order to obtain a safer and more effective diagnosis and better risk stratification for these patients. © 2016 Wiley Periodicals, Inc.
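The SD1/SD2 Poincaré-plot descriptors mentioned above follow from standard identities: SD1² equals half the variance of successive RR differences, and SD2² = 2·SDNN² − SD1². A minimal sketch (the RR series is invented, and population variance is used for simplicity):

```python
import math

def poincare_sd1_sd2(rr):
    """SD1/SD2 Poincaré-plot descriptors from an RR-interval series
    (in ms), via the identities SD1^2 = var(diff(rr))/2 and
    SD2^2 = 2*SDNN^2 - SD1^2. Population variances for simplicity."""
    n = len(rr)
    mean = sum(rr) / n
    sdnn2 = sum((x - mean) ** 2 for x in rr) / n          # SDNN^2
    d = [rr[i + 1] - rr[i] for i in range(n - 1)]          # successive diffs
    dmean = sum(d) / len(d)
    sd1_2 = sum((x - dmean) ** 2 for x in d) / len(d) / 2  # SD1^2
    sd2_2 = max(2 * sdnn2 - sd1_2, 0.0)                    # clip tiny negatives
    return math.sqrt(sd1_2), math.sqrt(sd2_2)

sd1, sd2 = poincare_sd1_sd2([800, 820, 810, 830, 805])
print(sd1, sd2)   # short-term vs long-term variability, in ms
```

SD1 captures short-term (beat-to-beat) variability and SD2 longer-term variability, which is why their ratio is used as a discriminator for autonomic dysfunction.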

  1. Detection of adverse events in general surgery using the " Trigger Tool" methodology.

    PubMed

    Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro

    2015-02-01

Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE detected by the "Global Trigger Tool" methodology in surgical patients, their characteristics, and the tool's validity. Retrospective, observational study of patients admitted to a general surgery department who underwent a surgical operation in a tertiary hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the "Global Trigger Tool" methodology. Once an AE was identified, a harm category was assigned, including the grade to which the AE could have been avoided and its relation to the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient. 56.2% were deemed preventable and 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted "Global Trigger Tool" methodology has demonstrated to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
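The reported sensitivity, specificity, PPV, and NPV all follow from a 2×2 table of trigger-tool flags versus exhaustive record review. The counts below are hypothetical, chosen only to roughly reproduce the reported figures; the study's own table is not given in the abstract:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics from a 2x2 table:
    tp/fp/fn/tn are true/false positives and negatives of the
    screening tool against the reference review."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts yielding values close to those reported above.
print(screening_metrics(tp=86, fp=11, fn=14, tn=161))
```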

  2. The water balance questionnaire: design, reliability and validity of a questionnaire to evaluate water balance in the general population.

    PubMed

    Malisova, Olga; Bountziouka, Vassiliki; Panagiotakos, Demosthenes B; Zampelas, Antonis; Kapsokefalou, Maria

    2012-03-01

There is a need to develop a questionnaire as a research tool for the evaluation of water balance in the general population. The water balance questionnaire (WBQ) was designed to evaluate water intake from fluid and solid foods and drinking water, and water loss from urine, faeces and sweat under sedentary conditions and during physical activity. For validation purposes, the WBQ was administered to 40 apparently healthy participants aged 22-57 years (37.5% males). Hydration indices in urine (24 h volume, osmolality, specific gravity, pH, colour) were measured through established procedures. Furthermore, the questionnaire was administered twice to 175 subjects to evaluate its reliability. Kendall's τ-b and the Bland and Altman method were used to assess the questionnaire's validity and reliability. The proposed WBQ to assess water balance in healthy individuals was found to be valid and reliable, and it could thus be a useful tool in future projects that aim to evaluate water balance.
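The Bland and Altman method used for the reliability assessment reduces to the bias (mean difference) between two administrations and its 95% limits of agreement. A minimal sketch, with invented data and the usual sample standard deviation:

```python
import math

def bland_altman(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two
    paired measurements of the same quantity (e.g. two questionnaire
    administrations). Uses the sample SD (n - 1 denominator)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

bias, lo, hi = bland_altman([1, 2, 3, 4], [1, 1, 2, 5])
print(bias, lo, hi)   # bias and limits of agreement
```

If nearly all differences fall within the limits and the bias is near zero, the two administrations are considered to agree, which is the sense in which the WBQ was judged reliable.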

  3. Color postprocessing for 3-dimensional finite element mesh quality evaluation and evolving graphical workstation

    NASA Technical Reports Server (NTRS)

    Panthaki, Malcolm J.

    1987-01-01

Three general tasks on general-purpose, interactive color graphics postprocessing for three-dimensional computational mechanics were accomplished. First, the existing program (POSTPRO3D) was ported to a high-resolution device. In the course of this transfer, numerous enhancements were implemented in the program. The performance of the hardware was evaluated from the point of view of engineering postprocessing, and the characteristics of future hardware were discussed. Second, interactive graphical tools were implemented to facilitate qualitative mesh evaluation from a single analysis. The literature was surveyed and a bibliography compiled. Qualitative mesh sensors were examined, and the use of two-dimensional plots of unaveraged responses on the surface of three-dimensional continua was emphasized in an interactive color raster graphics environment. Finally, a postprocessing environment was designed for state-of-the-art workstation technology. Modularity, personalization of the environment, integration of the engineering design processes, and the development and use of high-level graphics tools are some of the features of the intended environment.

  4. MFCompress: a compression tool for FASTA and multi-FASTA data.

    PubMed

    Pinho, Armando J; Pratas, Diogo

    2014-01-01

    The data deluge phenomenon is becoming a serious problem in most genomic centers. To alleviate it, general purpose tools, such as gzip, are used to compress the data. However, although pervasive and easy to use, these tools fall short when the intention is to reduce as much as possible the data, for example, for medium- and long-term storage. A number of algorithms have been proposed for the compression of genomics data, but unfortunately only a few of them have been made available as usable and reliable compression tools. In this article, we describe one such tool, MFCompress, specially designed for the compression of FASTA and multi-FASTA files. In comparison to gzip and applied to multi-FASTA files, MFCompress can provide additional average compression gains of almost 50%, i.e. it potentially doubles the available storage, although at the cost of some more computation time. On highly redundant datasets, and in comparison with gzip, 8-fold size reductions have been obtained. Both source code and binaries for several operating systems are freely available for non-commercial use at http://bioinformatics.ua.pt/software/mfcompress/.
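The baseline comparison against a general-purpose compressor can be reproduced with Python's standard gzip module. The sequence below is synthetic and deliberately redundant, purely to show how a compression ratio is measured; MFCompress itself is not invoked:

```python
import gzip

# A deliberately redundant multi-FASTA-like string (illustrative only).
seq = (">seq1\n" + "ACGTACGTAA" * 200 + "\n") * 5
raw = seq.encode("ascii")

packed = gzip.compress(raw)
ratio = len(raw) / len(packed)
print(f"{len(raw)} -> {len(packed)} bytes, ratio {ratio:.1f}x")
```

Specialised genomic compressors gain over this baseline by exploiting the 4-letter alphabet and long-range repeats across sequences, which gzip's 32 KiB window cannot see.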

  5. Identification of Shiga-Toxigenic Escherichia coli outbreak isolates by a novel data analysis tool after matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Christner, Martin; Dressler, Dirk; Andrian, Mark; Reule, Claudia; Petrini, Orlando

    2017-01-01

    The fast and reliable characterization of bacterial and fungal pathogens plays an important role in infectious disease control and tracking of outbreak agents. DNA based methods are the gold standard for epidemiological investigations, but they are still comparatively expensive and time-consuming. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a fast, reliable and cost-effective technique now routinely used to identify clinically relevant human pathogens. It has been used for subspecies differentiation and typing, but its use for epidemiological tasks, e. g. for outbreak investigations, is often hampered by the complexity of data analysis. We have analysed publicly available MALDI-TOF mass spectra from a large outbreak of Shiga-Toxigenic Escherichia coli in northern Germany using a general purpose software tool for the analysis of complex biological data. The software was challenged with depauperate spectra and reduced learning group sizes to mimic poor spectrum quality and scarcity of reference spectra at the onset of an outbreak. With high quality formic acid extraction spectra, the software's built in classifier accurately identified outbreak related strains using as few as 10 reference spectra (99.8% sensitivity, 98.0% specificity). Selective variation of processing parameters showed impaired marker peak detection and reduced classification accuracy in samples with high background noise or artificially reduced peak counts. However, the software consistently identified mass signals suitable for a highly reliable marker peak based classification approach (100% sensitivity, 99.5% specificity) even from low quality direct deposition spectra. The study demonstrates that general purpose data analysis tools can effectively be used for the analysis of bacterial mass spectra.
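A marker-peak classification of the kind described can be sketched as a presence/absence test over m/z values within a mass tolerance. The marker masses and tolerance below are invented for illustration; real marker selection is done by the analysis software from the reference spectra:

```python
def classify_by_markers(peaks, markers, tol=1.0):
    """Flag a spectrum as outbreak-related if every marker m/z value
    is matched by some observed peak within +/- tol Da. `peaks` and
    `markers` are lists of m/z values (floats)."""
    return all(any(abs(p - m) <= tol for p in peaks) for m in markers)

markers = [5000.0, 7841.0]            # hypothetical outbreak markers
print(classify_by_markers([5000.2, 6312.9, 7841.1], markers))  # → True
print(classify_by_markers([5000.2, 6312.9], markers))          # → False
```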

  6. Utilizing social media for informal ocean conservation and education: The BioOceanography Project

    NASA Astrophysics Data System (ADS)

    Payette, J.

    2016-02-01

Science communication through the use of social media is a rapidly evolving and growing pursuit in academic and scientific circles. Online tools and social media are being used in not only scientific communication but also scientific publication, education, and outreach. Standards and usage of social media as well as other online tools for communication, networking, outreach, and publication are always in development. Caution and a conservative attitude towards these novel "Science 2.0" tools are understandable because of their rapidly changing nature and the lack of professional standards for using them. However there are some key benefits and unique ways social media, online systems, and other Open or Open Source technologies, software, and "Science 2.0" tools can be utilized for academic purposes such as education and outreach. Diverse efforts for ocean conservation and education will continue to utilize social media for a variety of purposes. The BioOceanography project is an informal communication, education, outreach, and conservation initiative created for enhancing knowledge related to Oceanography and Marine Science with an unbiased yet conservation-minded approach and in an Open Source format. The BioOceanography project is ongoing and still evolving, but has already contributed to ocean education and conservation communication in key ways through a concerted web presence since 2013, including a curated Twitter account, @_Oceanography, and a BioOceanography blog-style website. Social media tools like those used in this project, if used properly, can be highly effective and valuable for encouraging students, networking with researchers, and educating the general public in Oceanography.

  7. Pilot Trial of an Electronic Family Medical History in US Faith-Based Communities.

    PubMed

    Newcomb, Patricia; Canclini, Sharon; Cauble, Denise; Raudonis, Barbara; Golden, Paulette

    2014-07-01

In spite of the acknowledged importance of collecting family health information, methods of collecting, organizing, and storing pedigree data are not uniformly utilized in practice, though several electronic tools have been developed for the purpose. Using electronic tools to gather health information may empower individuals to take responsibility in managing their family health history. The purpose of this study was to describe the feasibility and outcomes of introducing small groups to the My Family Health Portrait tool in faith-based communities using faith community nurses (FCNs). This pilot project adopted a mixed methods approach to assess the potential of an educational intervention delivered by FCNs for increasing the use of electronic technologies for organizing and storing family health histories among the general public. Treatment and control groups were recruited from four faith-based communities in north Texas using a parallel-groups quasi-experimental design. Qualitative data were gleaned from field notes made by investigators interacting with FCNs and observing their teaching. A majority of respondents believed that knowing one's health history and passing it on to family and medical personnel is important. Those receiving face-to-face instruction on the electronic tool were significantly more likely to have written down family health information than the control group who received only an informational handout (χ² = 5.96, P = .015). Barriers to teaching about and using the electronic tool included FCNs' lack of facility with computers in the educational context and FCN and respondent mistrust of electronic storage for family health information. © The Author(s) 2014.

  8. An investigation of environmental and sustainability discourses associated with the substantive purposes of environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozema, Jaap G., E-mail: j.rozema@uea.ac.uk; Bond, Alan J., E-mail: alan.bond@uea.ac.uk; Cashmore, Matthew, E-mail: cashmore@plan.aau.dk

    2012-02-15

This paper investigates the discursive construction of the substantive purposes of environmental assessment (EA). It addresses these purposes by exploring the complex and often multifaceted linkages between political factors and plural views of democracy, public participation, and the role of science that are embedded in environmental and sustainability discourses. The interaction between policy-making and public actors leads to the formulation of divergent and potentially competing rationales for public participation, and for social appraisal more generally. Participatory approaches have also given impetus to the development of several interpretations on the role of science in assessment procedures. Science is important in mediating public participation and the two are therefore reciprocally linked. This leads to discourses that become manifest in the construction of substantive purposes. Discourse analysis in EA is a relevant method for examining trends and patterns in sustainable development. It is argued that public participation is an important, if not decisive, variable in the articulation and civil legitimacy of certain purposes. A general proposition that results from this paper is that EA, although typically presented as an objective scientific tool, is an intrinsically normative process. Enhanced knowledge on the construction, and reconstruction over time, of substantive purposes is required if environmental and sustainability discourses are to be used and understood as meaningful analytical instruments to assess the socio-political implications of EA. - Highlights: • Substantive purposes related to environmental assessment may be best analyzed through discourse analysis. • Environmental and sustainability discourses are contingent on the level of participatory democracy and civic science. • Public participation is a decisive variable in the construction of the substantive purpose of environmental assessment.

  9. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation; one of these identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. 
The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  10. Conceptual Comparison of Population Based Metaheuristics for Engineering Problems

    PubMed Central

    Green, Paul

    2015-01-01

    Metaheuristic algorithms are well-known optimization tools which have been employed for solving a wide range of optimization problems. Several extensions of differential evolution have been adopted in solving constrained and nonconstrained multiobjective optimization problems, but in this study, the third version of generalized differential evolution (GDE) is used for solving practical engineering problems. GDE3 metaheuristic modifies the selection process of the basic differential evolution and extends DE/rand/1/bin strategy in solving practical applications. The performance of the metaheuristic is investigated through engineering design optimization problems and the results are reported. The comparison of the numerical results with those of other metaheuristic techniques demonstrates the promising performance of the algorithm as a robust optimization tool for practical purposes. PMID:25874265

  11. Conceptual comparison of population based metaheuristics for engineering problems.

    PubMed

    Adekanmbi, Oluwole; Green, Paul

    2015-01-01

    Metaheuristic algorithms are well-known optimization tools which have been employed for solving a wide range of optimization problems. Several extensions of differential evolution have been adopted in solving constrained and nonconstrained multiobjective optimization problems, but in this study, the third version of generalized differential evolution (GDE) is used for solving practical engineering problems. GDE3 metaheuristic modifies the selection process of the basic differential evolution and extends DE/rand/1/bin strategy in solving practical applications. The performance of the metaheuristic is investigated through engineering design optimization problems and the results are reported. The comparison of the numerical results with those of other metaheuristic techniques demonstrates the promising performance of the algorithm as a robust optimization tool for practical purposes.
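    The DE/rand/1/bin strategy that GDE3 extends can be sketched in a few lines. The following is a minimal illustration of the classic operator only, not of GDE3 itself; the function and parameter names are ours, not the authors':

    ```python
    import random

    def de_rand_1_bin(pop, f, cr, fitness):
        """One generation of the classic DE/rand/1/bin strategy (minimization).

        pop     -- list of candidate vectors (lists of floats), at least 4 of them
        f       -- differential weight, typically in [0.4, 1.0]
        cr      -- crossover probability in [0, 1]
        fitness -- objective function to minimize
        """
        dim = len(pop[0])
        new_pop = []
        for i, target in enumerate(pop):
            # Pick three distinct individuals, none equal to the target.
            r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
            mutant = [pop[r1][d] + f * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
            # Binomial crossover: at least one component comes from the mutant.
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < cr or d == j_rand) else target[d]
                     for d in range(dim)]
            # Greedy selection between trial and target.
            new_pop.append(trial if fitness(trial) <= fitness(target) else target)
        return new_pop
    ```

    Because the selection step is greedy, the best fitness in the population can never worsen from one generation to the next.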

  12. The value of the Semantic Web in the laboratory.

    PubMed

    Frey, Jeremy G

    2009-06-01

    The Semantic Web is beginning to impact on the wider chemical and physical sciences, beyond the earlier-adopting field of bioinformatics. While useful in large-scale data-driven science with automated processing, these technologies can also help integrate the work of smaller-scale laboratories producing diverse data. The semantics aid discovery and reliable re-use of data, provide improved provenance, and facilitate automated processing through increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools and collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make their mark, once the more general-purpose tools become more widely available.

  13. Re-Engineering JPL's Mission Planning Ground System Architecture for Cost Efficient Operations in the 21st Century

    NASA Technical Reports Server (NTRS)

    Fordyce, Jess

    1996-01-01

    Work carried out to re-engineer the mission analysis segment of JPL's mission planning ground system architecture is reported on. The aim is to transform the existing software tools, originally developed for specific missions on different support environments, into an integrated, general purpose, multi-mission tool set. The issues considered are: the development of a partnership between software developers and users; the definition of key mission analysis functions; the development of a consensus based architecture; the move towards evolutionary change instead of revolutionary replacement; software reusability, and the minimization of future maintenance costs. The current status and aims of new developments are discussed and specific examples of cost savings and improved productivity are presented.

  14. The adoption of social media and social media marketing by dentists in South Africa.

    PubMed

    Snyman, L; Visser, J H

    2014-07-01

    The purpose of the study was to identify and understand the social media usage behaviour of dentists in South Africa, both in general and as part of their marketing strategy, and to consider the potential determinants associated with these behaviours. Dentists who are members of the South African Dental Association were requested to anonymously complete an online questionnaire. Apart from demographic information, respondents were asked to report on their use of social media and their adoption of social media marketing. One-on-one interviews were also conducted with three dentists, to gain a deeper understanding of their adoption of this marketing option. South African dentists have started to embrace social media and 50.2% interact through these channels at least once a day. The most popular social media platforms are GooglePlus and Facebook. Respondents use social media mainly for personal purposes, including staying connected to family and friends. Only 13.2% of those responding currently use social media as a marketing tool, but the majority (83.5%) predict that such usage will increase in future. Social media marketing is a growing trend and will become more significant in future. Although respondents used social media regularly for personal purposes, most are only now starting to use it as a marketing tool.

  15. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  16. Self-learning Monte Carlo method

    DOE PAGES

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; ...

    2017-01-04

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large size systems close to the phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. Lastly, we demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10–20 times speedup.
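    The correction step at the heart of SLMC can be illustrated schematically. This sketch assumes a proposal routine that is in detailed balance with the learned effective energy; all names are illustrative and not taken from the paper's code:

    ```python
    import math
    import random

    def slmc_step(state, propose_eff, e_orig, e_eff, beta):
        """One schematic self-learning Monte Carlo step.

        propose_eff -- draws a new configuration from a Markov chain in
                       detailed balance with the *effective* (learned) model
        e_orig      -- energy under the original model
        e_eff       -- energy under the learned effective model
        """
        new = propose_eff(state)
        delta = (e_orig(new) - e_orig(state)) - (e_eff(new) - e_eff(state))
        # Metropolis correction: accept with min(1, exp(-beta * delta)).
        # This compensates for the mismatch between the effective and
        # original energies, so the chain still samples the original model.
        if delta <= 0 or random.random() < math.exp(-beta * delta):
            return new
        return state
    ```

    When the learned model is a good approximation, `delta` is small and the efficient global proposals are almost always accepted, which is the source of the reported speedup.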

  17. DRR is a teenager

    NASA Astrophysics Data System (ADS)

    Nagy, George

    2008-01-01

    The fifteenth anniversary of the first SPIE symposium (titled Character Recognition Technologies) on Document Recognition and Retrieval provides an opportunity to examine DRR's contributions to the development of document technologies. Many of the tools taken for granted today, including workable general purpose OCR, large-scale, semi-automatic forms processing, inter-format table conversion, and text mining, followed research presented at this venue. This occasion also affords an opportunity to offer tribute to the conference organizers and proceedings editors and to the coterie of professionals who regularly participate in DRR.

  18. ELAS: A powerful, general purpose image processing package

    NASA Technical Reports Server (NTRS)

    Walters, David; Rickman, Douglas

    1991-01-01

    ELAS is a software package which has been utilized as an image processing tool for more than a decade. It has been the source of several commercial packages. Now available on UNIX workstations, it is a very powerful, flexible set of software. Applications at Stennis Space Center have covered a very wide range of areas, including medicine, forestry, geology, ecological modeling, and sonar imagery. It remains one of the most powerful image processing packages available, either commercially or in the public domain.

  19. The ARC/INFO geographic information system

    NASA Astrophysics Data System (ADS)

    Morehouse, Scott

    1992-05-01

    ARC/INFO is a general-purpose system for processing geographic information. It is based on a relatively simple model of geographic space—the coverage—and contains an extensive set of geoprocessing tools which operate on coverages. ARC/INFO is used in a wide variety of application areas, including natural-resource inventory and planning, cadastral database development and mapping, urban and regional planning, and cartography. This paper is an overview of ARC/INFO and discusses the ARC/INFO conceptual architecture, data model, operators, and user interface.

  20. Organization and use of a Software/Hardware Avionics Research Program (SHARP)

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.; Kareemi, M. N.

    1975-01-01

    The organization and use of the Software/Hardware Avionics Research Program (SHARP), developed to duplicate the automatic portion of the STOLAND simulator system on a general-purpose computer system (i.e., an IBM 360), are described. The program's uses are: (1) to conduct comparative evaluation studies of current and proposed airborne and ground system concepts via single-run or Monte Carlo simulation techniques, and (2) to provide a software tool for efficient algorithm evaluation and development for the STOLAND avionics computer.

  1. Assessing pre-service science teachers' technological pedagogical content knowledge (TPACK) through observations and lesson plans

    NASA Astrophysics Data System (ADS)

    Canbazoglu Bilici, Sedef; Selcen Guzey, S.; Yamak, Havva

    2016-05-01

    Background: Technological pedagogical content knowledge (TPACK) is critical for effective teaching with technology. However, science teacher education programs generally do not help pre-service teachers develop TPACK. Purpose: The purpose of this study was to assess pre-service science teachers' TPACK over a semester-long Science Methods course. Sample: Twenty-seven pre-service science teachers took the course toward the end of their four-year teacher education program. Design and method: The study employed the case study methodology. Lesson plans and microteaching observations were used as data collection tools. The Technological Pedagogical Content Knowledge-based lesson plan assessment instrument (TPACK-LpAI) and the Technological Pedagogical Content Knowledge Observation Protocol (TPACK-OP) were used to analyze data obtained from observations and lesson plans. Results: The results showed that the TPACK-focused Science Methods course had an impact on pre-service teachers' TPACK to varying degrees. Most importantly, the course helped teachers gain knowledge of the effective usage of educational technology tools. Conclusion: Teacher education programs should provide opportunities for pre-service teachers to develop their TPACK so that they can effectively integrate technology into their teaching.

  2. NAIplot: An opensource web tool to visualize neuraminidase inhibitor (NAI) phenotypic susceptibility results using kernel density plots.

    PubMed

    Lytras, Theodore; Kossyvakis, Athanasios; Mentis, Andreas

    2016-02-01

    The results of neuraminidase inhibitor (NAI) enzyme inhibition assays are commonly expressed as 50% inhibitory concentration (IC50) fold-change values and presented graphically in box plots (box-and-whisker plots). An alternative and more informative type of graph is the kernel density plot, which we propose should be the preferred one for this purpose. In this paper we discuss the limitations of box plots and the advantages of the kernel density plot, and we present NAIplot, an opensource web application that allows convenient creation of density plots specifically for visualizing the results of NAI enzyme inhibition assays, as well as for general purposes. Copyright © 2015 Elsevier B.V. All rights reserved.
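    A kernel density plot is built from exactly this kind of estimate. As a rough illustration of the technique (not NAIplot's implementation), a Gaussian KDE with Silverman's rule-of-thumb bandwidth can be computed as follows:

    ```python
    import math

    def gaussian_kde(samples, grid, bandwidth=None):
        """Minimal Gaussian kernel density estimate.

        Evaluates the density of `samples` at each point of `grid`, using
        Silverman's rule of thumb for the bandwidth when none is given.
        """
        n = len(samples)
        if bandwidth is None:
            mean = sum(samples) / n
            sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)
            # Fall back to 1.0 if all samples coincide (sd == 0).
            bandwidth = 1.06 * sd * n ** (-1 / 5) or 1.0
        norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
        return [norm * sum(math.exp(-0.5 * ((g - x) / bandwidth) ** 2)
                           for x in samples)
                for g in grid]
    ```

    Plotting the returned densities against the grid (e.g. of log2 IC50 fold-change values) yields the smooth curve that, unlike a box plot, reveals multimodality in the data.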

  3. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. 
It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
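    The array-of-values-per-position idea can be illustrated with a toy projection from genomic to transcript coordinates. This hypothetical helper is only a sketch of the concept and is not the Plastid API:

    ```python
    def transcript_counts(genome_counts, exons, strand="+"):
        """Project per-genomic-position counts onto transcript coordinates.

        genome_counts -- dict mapping genomic position -> count (e.g. reads)
        exons         -- list of (start, end) half-open genomic intervals
        strand        -- "+" or "-"; a "-" transcript is read in reverse

        Concatenating exon positions handles splicing, so downstream code can
        work in transcript coordinates without any splice bookkeeping.
        """
        positions = [p for start, end in exons for p in range(start, end)]
        if strand == "-":
            positions.reverse()
        return [genome_counts.get(p, 0) for p in positions]
    ```

    Once data are in such position-indexed arrays, nucleotide-resolution operations (windowing, metagene averaging, per-codon sums) reduce to ordinary array manipulation.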

  4. Scaling up close-range surveys, a challenge for the generalization of as-built data in industrial applications

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.

    2014-06-01

    As-built CAD data reconstructed from Terrestrial Laser Scanner (TLS) data have been used for more than two decades by Electricité de France (EDF) to prepare maintenance operations in its facilities. But today, the big picture is renewed: "as-built virtual reality" must address a huge scale-up to provide data to an increasing number of applications. In this paper, we first present a wide multi-sensor, multi-purpose scanning campaign performed in a 10-floor building of a power plant in 2013: 1083 TLS stations (about 40 × 10^9 3D points referenced under a 2 cm tolerance) and 1025 RGB panoramic images (340 × 10^6 pixels per point of view). As expected, this very large survey of high-precision measurements in a complex environment stressed sensors and tools that were developed for more favourable conditions and smaller data sets. The whole survey process (tools and methods used from acquisition and processing to CAD reconstruction) underwent a detailed follow-up in order to identify the obstacles to a possible generalization to other buildings. Based on this recent feedback, we highlight some of the current bottlenecks in this paper: sensor denoising, automation in processes, improvements to data validation tools, and standardization of formats and (meta-)data structures.

  5. Optimising the use of observational electronic health record data: Current issues, evolving opportunities, strategies and scope for collaboration.

    PubMed

    Liaw, Siaw-Teng; Powell-Davies, Gawaine; Pearce, Christopher; Britt, Helena; McGlynn, Lisa; Harris, Mark F

    2016-03-01

    With increasing computerisation in general practice, national primary care networks are mooted as sources of data for health services and population health research and planning. Existing data collection programs - MedicinesInsight, Improvement Foundation, Bettering the Evaluation and Care of Health (BEACH) - vary in purpose, governance, methodologies and tools. General practitioners (GPs) have significant roles as collectors, managers and users of electronic health record (EHR) data. They need to understand the challenges to their clinical and managerial roles and responsibilities. The aim of this article is to examine the primary and secondary use of EHR data, identify challenges, discuss solutions and explore directions. Representatives from existing programs, Medicare Locals, Local Health Districts and research networks held workshops on the scope, challenges and approaches to the quality and use of EHR data. Challenges included data quality, interoperability, fragmented governance, proprietary software, transparency, sustainability, competing ethical and privacy perspectives, and cognitive load on patients and clinicians. Proposed solutions included effective change management; transparent governance and management of intellectual property, data quality, security, ethical access, and privacy; common data models, metadata and tools; and patient/community engagement. Collaboration and common approaches to tools, platforms and governance are needed. Processes and structures must be transparent and acceptable to GPs.

  6. Design and Evaluation of a Web-Based Symptom Monitoring Tool for Heart Failure.

    PubMed

    Wakefield, Bonnie J; Alexander, Gregory; Dohrmann, Mary; Richardson, James

    2017-05-01

    Heart failure is a chronic condition where symptom recognition and between-visit communication with providers are critical. Patients are encouraged to track disease-specific data, such as weight and shortness of breath. Use of a Web-based tool that facilitates data display in graph form may help patients recognize exacerbations and more easily communicate out-of-range data to clinicians. The purposes of this study were to (1) design a Web-based tool to facilitate symptom monitoring and symptom recognition in patients with chronic heart failure and (2) conduct a usability evaluation of the Web site. Patient participants generally had a positive view of the Web site and indicated it would support recording their health status and communicating with their doctors. Clinician participants generally had a positive view of the Web site and indicated it would be a potentially useful adjunct to electronic health delivery systems. Participants expressed a need to incorporate decision support within the site and wanted to add other data, for example, blood pressure, and have the ability to adjust font size. A few expressed concerns about data privacy and security. Technologies require careful design and testing to ensure they are useful, usable, and safe for patients and do not add to the burden of busy providers.

  7. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  8. Primer-BLAST: A tool to design target-specific primers for polymerase chain reaction

    PubMed Central

    2012-01-01

    Background Choosing appropriate primers is probably the single most important factor affecting the polymerase chain reaction (PCR). Specific amplification of the intended target requires that primers do not have matches to other targets in certain orientations and within certain distances that allow undesired amplification. The process of designing specific primers typically involves two stages. First, the primers flanking regions of interest are generated either manually or using software tools; then they are searched against an appropriate nucleotide sequence database using tools such as BLAST to examine the potential targets. However, the latter is not an easy process as one needs to examine many details between primers and targets, such as the number and the positions of matched bases, the primer orientations and distance between forward and reverse primers. The complexity of such analysis usually makes this a time-consuming and very difficult task for users, especially when the primers have a large number of hits. Furthermore, although the BLAST program has been widely used for primer target detection, it is in fact not an ideal tool for this purpose as BLAST is a local alignment algorithm and does not necessarily return complete match information over the entire primer range. Results We present a new software tool called Primer-BLAST to alleviate the difficulty in designing target-specific primers. This tool combines BLAST with a global alignment algorithm to ensure a full primer-target alignment and is sensitive enough to detect targets that have a significant number of mismatches to primers. Primer-BLAST allows users to design new target-specific primers in one step as well as to check the specificity of pre-existing primers. Primer-BLAST also supports placing primers based on exon/intron locations and excluding single nucleotide polymorphism (SNP) sites in primers. 
Conclusions We describe a robust and fully implemented general purpose primer design tool that designs target-specific PCR primers. Primer-BLAST offers flexible options to adjust the specificity threshold and other primer properties. This tool is publicly available at http://www.ncbi.nlm.nih.gov/tools/primer-blast. PMID:22708584
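    The full-length primer-target comparison the authors describe (and which a purely local aligner can miss) can be illustrated with a naive sliding-window check. This is a schematic sketch of the idea, not Primer-BLAST's algorithm:

    ```python
    def reverse_complement(seq):
        """Reverse complement of a DNA sequence (A/C/G/T only)."""
        comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
        return "".join(comp[b] for b in reversed(seq))

    def binding_sites(primer, template, max_mismatches=2):
        """Slide the primer (and its reverse complement) along a template and
        report every position where it matches over its *full* length with at
        most `max_mismatches` mismatches.

        Returns a list of (position, strand, mismatches) tuples.
        """
        hits = []
        for strand, probe in (("+", primer), ("-", reverse_complement(primer))):
            for i in range(len(template) - len(probe) + 1):
                mm = sum(a != b for a, b in zip(probe, template[i:i + len(probe)]))
                if mm <= max_mismatches:
                    hits.append((i, strand, mm))
        return hits
    ```

    A real tool must additionally weight 3'-end mismatches, enforce orientation and product-size limits for primer pairs, and search genome-scale databases efficiently, which is where combining BLAST with a global alignment pays off.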

  9. KARMA: the observation preparation tool for KMOS

    NASA Astrophysics Data System (ADS)

    Wegner, Michael; Muschielok, Bernard

    2008-08-01

    KMOS is a multi-object integral field spectrometer working in the near infrared which is currently being built for the ESO VLT by a consortium of UK and German institutes. It is capable of selecting up to 24 target fields for integral field spectroscopy simultaneously by means of 24 robotic pick-off arms. For the preparation of observations with KMOS a dedicated preparation tool KARMA ("KMOS Arm Allocator") will be provided which optimizes the assignment of targets to these arms automatically, thereby taking target priorities and several mechanical and optical constraints into account. For this purpose two efficient algorithms, both being able to cope with the underlying optimization problem in a different way, were developed. We present the concept and architecture of KARMA in general and the optimization algorithms in detail.
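    At its core, allocating targets to pick-off arms is an assignment problem under constraints. The brute-force sketch below illustrates the idea on a toy score matrix; the names and the constraint encoding are ours, not KARMA's (which uses efficient dedicated algorithms):

    ```python
    from itertools import permutations

    def best_assignment(score):
        """Exhaustively assign targets (rows) to arms (columns), maximizing
        total priority score.

        score[i][j] is the value of giving target i to arm j; None marks a
        mechanically or optically infeasible pairing. Brute force is fine for
        a toy example; a real allocator needs a polynomial-time method.
        """
        n = len(score)
        best, best_val = None, float("-inf")
        for perm in permutations(range(n)):
            if any(score[i][perm[i]] is None for i in range(n)):
                continue  # skip assignments that violate a constraint
            val = sum(score[i][perm[i]] for i in range(n))
            if val > best_val:
                best, best_val = perm, val
        return best, best_val
    ```

    For KMOS-sized problems (24 arms, many candidate targets) the factorial search space makes exhaustive enumeration infeasible, which is why the paper develops two dedicated optimization algorithms.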

  10. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon on our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  11. Transportable Applications Environment (TAE) Plus: A NASA tool used to develop and manage graphical user interfaces

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1992-01-01

    The Transportable Applications Environment (TAE) Plus was built to support the construction of graphical user interfaces (GUI's) for highly interactive applications, such as real-time processing systems and scientific analysis systems. It is a general purpose portable tool that includes a 'What You See Is What You Get' WorkBench that allows user interface designers to layout and manipulate windows and interaction objects. The WorkBench includes both user entry objects (e.g., radio buttons, menus) and data-driven objects (e.g., dials, gages, stripcharts), which dynamically change based on values of realtime data. Discussed here is what TAE Plus provides, how the implementation has utilized state-of-the-art technologies within graphic workstations, and how it has been used both within and without NASA.

  12. Validation, reliability, and specificity of CliniCom™ Psychiatric Assessment Software.

    PubMed

    Handal, Nelson; LePage, James; Dayley, Philip; Baldwin, Barbara; Roeser, Amellia; Kay, Joni; Theobald, Heather Ann; Nellamattathil, Michael; Drotar, Scott; Weir, Connor; Tindell, Neil; Tice, Kevin

    2018-07-01

    The purpose of this study was to determine the specificity and reproducibility of CliniCom™ Psychiatric Assessment Software to appropriately diagnose five prevalent mental health disorders. This online assessment tool incorporates proprietary algorithms for its propensity assessment. Unlike other questionnaires, which require a survey per specific mental disorder, CliniCom can simultaneously assess multiple mental disorders for an individual. CliniCom was concordant with other commonly used assessment tools in diagnosing five prevalent disorders including: Attention Deficit and Hyperactivity Disorder, Generalized Anxiety Disorder, Major Depressive Disorder, Obsessive Compulsive Disorder, and Social Phobia. The online tool was overall 78% concordant in diagnosing the same disorder during a test-retest analysis. When subjects exhibited two, three, or four disorders, the tool was less consistent in diagnosing the same set of disorders during the test-retest analysis (53% concordant). However, if evaluated as individual disorders within subjects, the more persistent disorders had a higher rate of concordance: MDD (83.3%), ADHD (81.0%), and OCD (68.4%). This study proposes CliniCom as an online assessment tool that demonstrates specificity in identifying specific psychiatric conditions and shows reproducibility over multiple administrations. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Tools for language: patterned iconicity in sign language nouns and verbs.

    PubMed

    Padden, Carol; Hwang, So-One; Lepic, Ryan; Seegers, Sharon

    2015-01-01

    When naming certain hand-held, man-made tools, American Sign Language (ASL) signers exhibit either of two iconic strategies: a handling strategy, where the hands show holding or grasping an imagined object in action, or an instrument strategy, where the hands represent the shape or a dimension of the object in a typical action. The same strategies are also observed in the gestures of hearing nonsigners identifying pictures of the same set of tools. In this paper, we compare spontaneously created gestures from hearing nonsigning participants to commonly used lexical signs in ASL. Signers and gesturers were asked to respond to pictures of tools and to video vignettes of actions involving the same tools. Nonsigning gesturers overwhelmingly prefer the handling strategy for both the Picture and Video conditions. Nevertheless, they use more instrument forms when identifying tools in pictures, and more handling forms when identifying actions with tools. We found that ASL signers generally favor the instrument strategy when naming tools, but when describing tools being used by an actor, they are significantly more likely to use more handling forms. The finding that both gesturers and signers are more likely to alternate strategies when the stimuli are pictures or video suggests a common cognitive basis for differentiating objects from actions. Furthermore, the presence of a systematic handling/instrument iconic pattern in a sign language demonstrates that a conventionalized sign language exploits the distinction for grammatical purpose, to distinguish nouns and verbs related to tool use. Copyright © 2014 Cognitive Science Society, Inc.

  14. A Delphi study assessing the utility of quality improvement tools and resources in Australian primary care.

    PubMed

    Upham, Susan J; Janamian, Tina; Crossland, Lisa; Jackson, Claire L

    2016-04-18

    To determine the relevance and utility of online tools and resources to support organisational performance development in primary care and to complement the Primary Care Practice Improvement Tool (PC-PIT). A purposively recruited Expert Advisory Panel of 12 end users used a modified Delphi technique to evaluate 53 tools and resources identified through a previously conducted systematic review. The panel comprised six practice managers and six general practitioners who had participated in the PC-PIT pilot study in 2013-2014. Tools and resources were reviewed in three rounds using a standard pre-tested assessment form. Recommendations, scores and reasons for recommending or rejecting each tool or resource were analysed to determine the final suite of tools and resources. The evaluation was conducted from November 2014 to August 2015. Recommended tools and resources scored highly (mean score, 16/20) in Rounds 1 and 2 of review (n = 25). These tools and resources were perceived to be easily used, useful to the practice and supportive of the PC-PIT. Rejected resources scored considerably lower (mean score, 5/20) and were noted to have limitations such as having no value to the practice and poor utility (n = 6). A final review (Round 3) of 28 resources resulted in a suite of 21 to support the elements of the PC-PIT. This suite of tools and resources offers one approach to supporting the quality improvement initiatives currently in development in primary care reform.

  15. Functional Investigations of HNF1A Identify Rare Variants as Risk Factors for Type 2 Diabetes in the General Population

    PubMed Central

    Najmi, Laeya Abdoli; Aukrust, Ingvild; Flannick, Jason; Molnes, Janne; Burtt, Noel; Molven, Anders; Groop, Leif; Altshuler, David; Johansson, Stefan; Njølstad, Pål Rasmus

    2017-01-01

Variants in HNF1A encoding hepatocyte nuclear factor 1α (HNF-1A) are associated with maturity-onset diabetes of the young form 3 (MODY 3) and type 2 diabetes. We investigated whether functional classification of HNF1A rare coding variants can inform models of diabetes risk prediction in the general population by analyzing the effect of 27 HNF1A variants identified in well-phenotyped populations (n = 4,115). Bioinformatics tools classified 11 variants as likely pathogenic, but these showed no association with diabetes risk (combined minor allele frequency [MAF] 0.22%; odds ratio [OR] 2.02; 95% CI 0.73–5.60; P = 0.18). However, a different set of 11 variants that reduced HNF-1A transcriptional activity to <60% of normal (wild-type) activity was strongly associated with diabetes in the general population (combined MAF 0.22%; OR 5.04; 95% CI 1.99–12.80; P = 0.0007). Our functional investigations indicate that 0.44% of the population carry HNF1A variants that result in a substantially increased risk of developing diabetes. These results suggest that functional characterization of variants within MODY genes may overcome the limitations of bioinformatics tools for the purposes of presymptomatic diabetes risk prediction in the general population. PMID:27899486
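To make the reported statistics concrete, an odds ratio and a log-scale (Woolf) 95% confidence interval of the kind quoted above can be computed from 2×2 carrier counts. The counts below are invented for illustration and are not the study's data:

```python
import math

# Hypothetical carrier counts (NOT the study's data):
# a, b = variant carriers among cases and controls
# c, d = non-carriers among cases and controls
def odds_ratio_ci(a, b, c, d):
    """Odds ratio with a Woolf (log-scale) 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(10, 5, 990, 1995)  # OR ≈ 4.03
```

An interval whose lower bound stays above 1, as for the functionally classified variant set in the abstract, indicates a statistically significant association.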

  16. Shape optimization and CAD

    NASA Technical Reports Server (NTRS)

    Rasmussen, John

    1990-01-01

Structural optimization has attracted attention since the days of Galileo. Olhoff and Taylor have produced an excellent overview of the classical research within this field. However, interest in structural optimization has increased greatly during the last decade due to the advent of reliable general numerical analysis methods and the computer power necessary to use them efficiently. This has created the possibility of developing general numerical systems for shape optimization. Several authors, e.g., Esping; Braibant & Fleury; Bennet & Botkin; Botkin, Yang, and Bennet; and Stanton, have published practical and successful applications of general optimization systems. Ding and Homlein have produced extensive overviews of available systems. Furthermore, a number of commercial optimization systems based on well-established finite element codes have been introduced. Systems like ANSYS, IDEAS, OASIS, and NISAOPT are widely known examples. In parallel to this development, the technology of computer aided design (CAD) has gained a large influence on the design process in mechanical engineering. CAD technology has already lived through a rapid development driven by the drastically growing capabilities of digital computers. However, the systems of today are still considered only the first generation of a long line of computer integrated manufacturing (CIM) systems. The systems to come will offer an integrated environment for design, analysis, and fabrication of products of almost any character. Thus, the CAD system can be regarded as a database for geometrical information equipped with a number of tools that help the user in the design process. Among these tools are facilities for structural analysis and optimization, as well as present standard CAD features like drawing, modeling, and visualization tools.
The state of the art in structural optimization is that a large number of mathematical and mechanical techniques are available for solving individual problems. By implementing collections of the available techniques in general software systems, operational environments for structural optimization have been created. The coming years must bring solutions to the problem of integrating such systems into more general design environments. The result of this work should be CAD systems for rational design in which structural optimization is one important design tool among many others.

  17. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

When metallic prosthetic appliances and dental fillings are present in the oral cavity, metal-induced streak artefacts are unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, reconstruction of images with weak artefacts was attempted using projection data from an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. Ordered subset-expectation maximization and the small region of interest reduced the processing duration without apparent detriment. The general-purpose graphics processing unit delivered high performance. A statistical reconstruction method was applied for streak artefact reduction. The alternative algorithms applied were effective. Both software and hardware tools, such as ordered subset-expectation maximization, a small region of interest and a general-purpose graphics processing unit, achieved fast artefact correction.
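The maximum likelihood-expectation maximization step mentioned above has a compact multiplicative update. The following is a generic toy sketch with an invented system matrix, not the authors' CT implementation:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM for the model y ≈ A @ x with nonnegative x.

    Each iteration forward-projects the current image, compares it with the
    measured data via a ratio, and back-projects that ratio as a
    multiplicative correction."""
    x = np.ones(A.shape[1])            # uniform initial image
    sens = A.T @ np.ones(A.shape[0])   # sensitivity image (column sums)
    for _ in range(n_iter):
        proj = np.maximum(A @ x, 1e-12)        # forward projection
        x *= (A.T @ (y / proj)) / np.maximum(sens, 1e-12)
    return x

A = np.array([[1., 0.], [0., 1.], [1., 1.]])   # toy 3-ray, 2-pixel system
x_rec = mlem(A, A @ np.array([2., 3.]))        # recovers ≈ [2, 3]
```

The ordered subset variant (OSEM) accelerates this scheme by applying the same update to subsets of the rays in turn, which is why it shortens the processing duration reported above.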

  18. Affordances of Telecollaboration Tools for English for Specific Purposes Online Learning

    ERIC Educational Resources Information Center

    Sevilla-Pavón, Ana

    2016-01-01

    This paper explores students' perceptions of the affordances of different telecollaboration tools used in an innovation project for English for Specific Purposes online learning carried out between the University of Valencia (Spain) and Wofford College (South Carolina, United States) during the school year 2015-2016. Different tools for…

  19. An Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was previously proposed for this purpose: decomposing the system to the level of the generic digital components and propagating failure modes up to the system level, which is generally time-consuming and difficult to implement. To overcome the issues associated with implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is in essence a simulation platform, developed by using or recreating the original source code of the different software modules, which are interfaced through input and output variables representing the physical signals exchanged between the modules, the system, and the controlled process. For any given failure mode, its impacts on the associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
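The fault-injection idea behind such a tool can be illustrated with a deliberately simple control loop: force one signal to a failed value and evaluate a system-failure criterion. Everything below (the plant, the controller gains, the threshold) is invented for illustration and is unrelated to the actual DFWCS code:

```python
def simulate(steps=200, setpoint=5.0, sensor_fault=None):
    """Proportional level control; an injected fault pins the sensor reading.

    Returns True if the failure criterion (control lost) is met."""
    level = setpoint
    for _ in range(steps):
        # Failure-mode injection: the sensor reports a stuck value.
        reading = level if sensor_fault is None else sensor_fault
        valve = max(0.0, min(1.0, 0.5 + 0.2 * (setpoint - reading)))
        level += valve - 0.5               # inflow via valve, constant outflow
    return abs(level - setpoint) > 2.0     # criterion: level has run away

nominal_failed = simulate()                      # no fault: control holds
stuck_high_failed = simulate(sensor_fault=9.0)   # sensor stuck high: failure
```

Sweeping such a simulation over every failure mode of every generic component yields the system-level impact table that an FMEA requires.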

  20. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named the Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded in GAMS and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results. Therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization in small surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments.
Its size makes it possible to use either the SDP or the SDDP method. The independent use of surface water and groundwater can be examined with and without the aquifer. The ESPAT_DET, ESPATR and ESPAT_SDP modules were executed for the surface system, while the ESPAT_RA and ESPAT_DET modules were run for the coupled surface-groundwater system. The surface system's results show similar performance for the ESPAT_SDP and ESPATR modules, which outperform the current policies while being outperformed by the ESPAT_DET results, which have the advantage of perfect foresight. The surface-groundwater system's results show a robust situation in which the differences between the modules' results and the current policies are smaller, owing to the use of pumped groundwater for the XX century crops when surface water is scarce. The results are realistic, with the deterministic optimization outperforming the stochastic one, which in turn outperforms the current policies, showing that the tool is able to stochastically optimize river-aquifer water resource systems. We are currently working on applying these tools to the analysis of changes in system operation under global change conditions. ACKNOWLEDGEMENT: This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) funds.
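The standard SDP backward recursion that a module like ESPAT_SDP automates can be sketched for a single toy reservoir. All data below (storage grid, inflow scenarios, benefit function) are invented, and this is not the ESPAT code:

```python
import numpy as np

S = np.arange(5)                 # discrete storage grid (volume units)
inflows = [1, 2]                 # equiprobable inflow scenarios per stage
T = 4                            # number of stages in the horizon

def benefit(release):
    return np.sqrt(release)      # concave benefit: diminishing returns

V = np.zeros(len(S))             # terminal value-to-go
for t in range(T):               # backward recursion over stages
    V_new = np.zeros(len(S))
    for i, s in enumerate(S):
        expected = 0.0
        for q in inflows:        # expectation over inflow scenarios
            best = 0.0
            for j, s_next in enumerate(S):
                release = s + q - s_next      # reservoir mass balance
                if release >= 0:
                    best = max(best, benefit(release) + V[j])
            expected += best / len(inflows)
        V_new[i] = expected
    V = V_new                    # V[i]: expected benefit-to-go at storage S[i]
```

SDDP replaces the enumeration over future storages with a piecewise-linear approximation of the value function built from dual information, which is what lets it scale to multi-reservoir systems like the Mijares case.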

  1. A General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.; Shaklan, Stuart B.

    2009-01-01

    This paper describes a general purpose Coronagraph Performance Error Budget (CPEB) tool that we have developed under the NASA Exoplanet Exploration Program. The CPEB automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. It operates in 3 steps: first, a CodeV or Zemax prescription is converted into a MACOS optical prescription. Second, a Matlab program calls ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled coarse and fine-steering mirrors. Third, the sensitivity matrices are imported by macros into Excel 2007 where the error budget is created. Once created, the user specifies the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions and combines them with the sensitivity matrices to generate an error budget for the system. The user can easily modify the motion allocations to perform trade studies.
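The final roll-up step, combining motion allocations with sensitivity matrices into contrast terms, amounts to simple linear algebra. The sketch below uses invented numbers and is not the CPEB spreadsheet logic:

```python
import numpy as np

# Hypothetical sensitivity matrix: contrast contribution per nanometre of
# rms motion (rows: error-budget terms, columns: optical elements).
sens = np.array([[2.0e-11, 5.0e-12],
                 [1.0e-11, 8.0e-12]])
motions = np.array([0.5, 0.2])        # allocated rms motions in nm

contrib = sens @ motions              # contrast contribution of each term
total = np.sqrt(np.sum(contrib**2))   # root-sum-square roll-up of the terms
```

A trade study then amounts to editing the motion allocations and recomputing the roll-up, which is exactly the workflow the spreadsheet automates.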

  2. MCNP Version 6.2 Release Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.

Monte Carlo N-Particle or MCNP® is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new and improved physics, sources, data, tallies, unstructured mesh, code enhancements and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues with which the user should familiarize themselves (see Appendix).

  3. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE PAGES

    Khodak, Andrei

    2017-08-21

Here, the analysis of many fusion applications such as liquid-metal blankets requires the application of computational fluid dynamics (CFD) methods to electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general-purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugate heat transfer analysis in the liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, a general-purpose CFD code becomes a valuable tool for the design and optimization of fusion devices. This paper describes the introduction of MHD capability into the general-purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows the introduction of user-defined variables governed by transport or Poisson equations. For the MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for the electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with Hartmann numbers of up to tens of thousands. In this situation the system of MHD equations becomes very stiff, with very large source terms and very steep variable gradients. To increase robustness, special measures were introduced in the iterative convergence process, such as linearization using source coefficients for the momentum equations. The MHD implementation in the general-purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for Hartmann numbers of up to 1500 for 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. Results for a 3-D test case are also included.
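A classical benchmark for such validation is laminar Hartmann flow, which has a closed-form velocity profile between insulating walls. The sketch below, in normalized units, is the analytical check itself, not the CFX implementation:

```python
import numpy as np

def hartmann_profile(y, Ha):
    """Velocity normalized by the centreline value for Hartmann flow
    between insulating walls; wall-normal coordinate y in [-1, 1]."""
    return (np.cosh(Ha) - np.cosh(Ha * y)) / (np.cosh(Ha) - 1.0)

y = np.linspace(-1.0, 1.0, 201)
u_low = hartmann_profile(y, Ha=1.0)    # near-parabolic at low Hartmann number
u_high = hartmann_profile(y, Ha=50.0)  # flat core, thin Hartmann wall layers
```

As the Hartmann number grows, the profile flattens in the core and the velocity gradient concentrates in thin wall layers of thickness ~1/Ha, which is what makes these cases numerically stiff.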

  4. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodak, Andrei

Here, the analysis of many fusion applications such as liquid-metal blankets requires the application of computational fluid dynamics (CFD) methods to electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general-purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugate heat transfer analysis in the liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, a general-purpose CFD code becomes a valuable tool for the design and optimization of fusion devices. This paper describes the introduction of MHD capability into the general-purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows the introduction of user-defined variables governed by transport or Poisson equations. For the MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for the electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with Hartmann numbers of up to tens of thousands. In this situation the system of MHD equations becomes very stiff, with very large source terms and very steep variable gradients. To increase robustness, special measures were introduced in the iterative convergence process, such as linearization using source coefficients for the momentum equations. The MHD implementation in the general-purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for Hartmann numbers of up to 1500 for 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. Results for a 3-D test case are also included.

  5. A new tool to assess compliance of mental health laws with the convention on the rights of persons with disabilities.

    PubMed

    Byrne, Marion; White, Ben; McDonald, Fiona

Since the introduction of the Convention on the Rights of Persons with Disabilities (2006) (CRPD), there have been calls to establish standards to measure the compliance of domestic mental health laws with the human rights outlined in the CRPD. This article aims to address this gap by proposing a tool: the Analysis Instrument for Mental health (AIM). In particular, the tool's purpose is to enable states and civil society to assess the compliance of non-forensic domestic mental health laws with Article 12 of the CRPD. It responds to Dawson's (2015) call for a mechanism designed to provide clear and measurable standards with which to undertake this exercise. The content of AIM draws directly from the authoritative interpretation of Article 12 provided by the United Nations Committee on the Rights of Persons with Disabilities (the Committee) in its General Comment, as well as the substantial body of academic and other literature on Article 12. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    PubMed

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools such as segmentation and meshing algorithms are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it not only describes the final biomechanical simulation, but also the workflow how the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease-of-use of the approach.

  7. ReProTool Version 2.0: Re-Engineering Academic Curriculum Using Learning Outcomes, ECTS and Bologna Process Concepts

    ERIC Educational Resources Information Center

    Pouyioutas, Philippos; Gjermundrod, Harald; Dionysiou, Ioanna

    2012-01-01

Purpose: The purpose of this paper is to present ReProTool Version 2.0, a software tool that is used for the European Credit Transfer System (ECTS) and the Bologna Process re-engineering of academic programmes. The tool is the result of an 18-month project (February 2012-July 2013), co-financed by the European Regional Development Fund…

  8. Web Usage Mining Analysis of Federated Search Tools for Egyptian Scholars

    ERIC Educational Resources Information Center

    Mohamed, Khaled A.; Hassan, Ahmed

    2008-01-01

    Purpose: This paper aims to examine the behaviour of the Egyptian scholars while accessing electronic resources through two federated search tools. The main purpose of this article is to provide guidance for federated search tool technicians and support teams about user issues, including the need for training. Design/methodology/approach: Log…

  9. Study of launch site processing and facilities for future launch vehicles

    NASA Astrophysics Data System (ADS)

    Shaffer, Rex

    1995-03-01

The purpose of this research is to provide innovative and creative approaches to assess the impact on the Kennedy Space Center and other launch sites of a range of candidate manned and unmanned space transportation systems. The general scope of the research includes the engineering activities, analyses, and evaluations defined in the four tasks below: (1) development of innovative approaches and computer aided tools; (2) operations analyses of launch vehicle concepts and designs; (3) assessment of ground operations impacts; and (4) development of methodologies to identify promising technologies.

  10. A SCILAB Program for Computing General-Relativistic Models of Rotating Neutron Stars by Implementing Hartle's Perturbation Method

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

We apply Hartle's perturbation method to the computation of relativistic rigidly rotating neutron star models. The program has been written in SCILAB (© INRIA ENPC), a matrix-oriented high-level programming language. The numerical method is described in detail and is applied to many models in slow or fast rotation. We show that, although the method is perturbative, it gives accurate results for all practical purposes and should prove an efficient tool for computing rapidly rotating pulsars.

  11. Nuflood, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tasseff, Byron

    2016-07-29

NUFLOOD Version 1.x is a surface-water hydrodynamic package designed for the simulation of overland flow of fluids. It consists of various routines addressing a wide range of applications (e.g., rainfall-runoff, tsunami, storm surge) and real-time, interactive visualization tools. NUFLOOD has been designed for general-purpose computers and workstations containing multi-core processors and/or graphics processing units. The software is easy to use and extensible, designed with instructors, students, and practicing engineers in mind. NUFLOOD is intended to assist the water resource community in planning against water-related natural disasters.

  12. Study of launch site processing and facilities for future launch vehicles

    NASA Technical Reports Server (NTRS)

    Shaffer, Rex

    1995-01-01

The purpose of this research is to provide innovative and creative approaches to assess the impact on the Kennedy Space Center and other launch sites of a range of candidate manned and unmanned space transportation systems. The general scope of the research includes the engineering activities, analyses, and evaluations defined in the four tasks below: (1) development of innovative approaches and computer aided tools; (2) operations analyses of launch vehicle concepts and designs; (3) assessment of ground operations impacts; and (4) development of methodologies to identify promising technologies.

  13. Gaia archive

    NASA Astrophysics Data System (ADS)

    Hypki, Arkadiusz; Brown, Anthony

    2016-06-01

The Gaia archive is being designed and implemented by the DPAC Consortium. The purpose of the archive is to maximize the scientific exploitation of the Gaia data by the astronomical community. It is therefore crucial to gather and discuss the features of the Gaia archive with the community as much as possible. From the point of view of the GENIUS project, it is especially important to gather feedback and potential use cases for the archive. This paper briefly presents the general ideas behind the Gaia archive and the tools already provided to the community.

  14. BioImageXD: an open, general-purpose and high-throughput image-processing platform.

    PubMed

    Kankaanpää, Pasi; Paavolainen, Lassi; Tiitta, Silja; Karjalainen, Mikko; Päivärinne, Joacim; Nieminen, Jonna; Marjomäki, Varpu; Heino, Jyrki; White, Daniel J

    2012-06-28

    BioImageXD puts open-source computer science tools for three-dimensional visualization and analysis into the hands of all researchers, through a user-friendly graphical interface tuned to the needs of biologists. BioImageXD has no restrictive licenses or undisclosed algorithms and enables publication of precise, reproducible and modifiable workflows. It allows simple construction of processing pipelines and should enable biologists to perform challenging analyses of complex processes. We demonstrate its performance in a study of integrin clustering in response to selected inhibitors.

  15. A security vulnerabilities assessment tool for interim storage facilities of low-level radioactive wastes.

    PubMed

    Bible, J; Emery, R J; Williams, T; Wang, S

    2006-11-01

Limited permanent low-level radioactive waste (LLRW) disposal capacity and correspondingly high disposal costs have resulted in the creation of numerous interim storage facilities for either decay-in-storage operations or longer term accumulation efforts. These facilities, which may be near the site of waste generation or in distant locations, often were not originally designed for the purpose of LLRW storage, particularly with regard to security. Facility security has become particularly important in light of the domestic terrorist acts of 2001, wherein LLRW, along with many other sources of radioactivity, became a recognized commodity to those wishing to create disruption through the purposeful dissemination of radioactive materials. Since some LLRW may be held in facilities exhibiting varying degrees of security control sophistication, a security vulnerabilities assessment tool grounded in accepted criminal justice theory and security practice has been developed. The tool, which includes dedicated sections on general security, target hardening, criminalization benefits, and the presence of guardians, can be used by those not formally schooled in the security profession to assess the level of protection afforded to their respective facilities. The tool equips radiation safety practitioners with the ability to methodically and systematically assess the presence or relative status of various facility security aspects, many of which may not be considered by individuals from outside the security profession. For example, radiation safety professionals might not ordinarily consider facility lighting, a staple of the security profession, since it is widely known that crime occurs disproportionately at night or in poorly lit circumstances. Likewise, the means, and associated time dimensions, for detecting inventory discrepancies may not be commonly considered.
The tool provides a simple means for radiation safety professionals to assess, and perhaps enhance in a reasonable fashion, the security of their interim storage operations. Aspects of the assessment tool can also be applied to other activities involving the protection of sources of radiation as well.

  16. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules covering a wide range of purposes, and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement that rasters fit in memory. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research targeted towards custom analytical tools, prototyping and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and the Perl modules.
Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or with generic tools such as Swig. We are interested in implementing and testing linking Geoinformatica with existing or new more specific hydrological FOSS.
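
    The general-purpose raster algebra mentioned above can be illustrated with a minimal pure-Python sketch (hypothetical code; Geoinformatica itself implements this in C with Perl bindings): rasters as 2D arrays combined cell by cell, with a nodata sentinel propagated through the operation.

```python
# Minimal raster-algebra sketch (illustrative, not the Geoinformatica API):
# rasters are 2D lists with a nodata sentinel, combined cell by cell.

NODATA = -9999

def raster_op(a, b, op):
    """Apply a binary operation cell-wise; propagate nodata cells."""
    out = []
    for row_a, row_b in zip(a, b):
        out.append([
            NODATA if NODATA in (x, y) else op(x, y)
            for x, y in zip(row_a, row_b)
        ])
    return out

dem = [[100, 102], [101, NODATA]]
rain = [[5, 7], [6, 8]]
total = raster_op(dem, rain, lambda x, y: x + y)
# total == [[105, 109], [107, NODATA]]
```

    The same pattern extends to any cell-wise map-algebra operation by swapping the `op` callable.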

  17. Education for Sustainable Development and Global Citizenship: An Evaluation of the Validity of the STAUNCH Auditing Tool

    ERIC Educational Resources Information Center

    Glover, Alison; Peters, Carl; Haslett, Simon K.

    2011-01-01

    Purpose: The purpose of this paper is to test the validity of the curriculum auditing tool Sustainability Tool for Auditing University Curricula in Higher Education (STAUNCH[C]), which was designed to audit the education for sustainability and global citizenship content of higher education curricula. The Welsh Assembly Government aspires to…

  18. [Program of studies on psychiatric epidemiology in Argentina. General report].

    PubMed

    Casullo, M M

    1980-12-01

    This paper is an outline of a wide program that is currently under development in the large territory of Argentina. The Director of the Program is Dr. Fernando Pagés Larraya; it is supported by the National Council of Scientific Research (CONICET) and the National Board of Mental Health. The general purpose of the program is to study the prevalence of mental disorders in different ethnographic areas within the country. Epidemiology allows the forecasting of disease occurrence. Research in this area may be qualified as "effective" if it provides useful data for prevention programs. It is therefore necessary that researchers and the professionals responsible for governmental mental health decisions work together. This rapprochement is being attempted in developing the Argentine research program. It has a cross-cultural approach, which can be called "a way of thinking" as opposed to a precise methodology. A considerable variety of research tools is being used, depending on the specific purposes and the characteristics of the ethnographic areas. One of the main difficulties in choosing a technique for "case-finding" is uncertainty about where to place the "cut-off point" between presence and absence of illness. In this program the Present State Examination (PSE), a semi-structured interview that has been extensively tested, is used in population surveys of large urban centers. In small rural communities, the work is done using "key informants" and applying the snowball sampling technique. One specific purpose of the research is the study of the modal personality structure in each ethnographic area, formulated in terms of the Holtzman Inkblot Test. The paper shows the relationships between purposes, research tools, and responsible professionals. There is hardly time or surplus intellectual energy for polemic and alienation between clinicians and social scientists. Theories, methodologies, research data, and prevention programs have not developed harmoniously.
    We need to carry out research that is not divorced from the public health authorities, so that useful data from epidemiological studies are actually applied in prevention programs.

  19. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    PubMed Central

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.

    2016-01-01

    Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com PMID:26556387
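
    The core task BioNetFit automates can be sketched generically (this is not BioNetFit's algorithm or API; the simulator and data here are invented): each candidate parameter value is scored by the sum of squared residuals between simulated and observed data, and the best-scoring value is kept. BioNetFit farms these independent evaluations out to cluster nodes; in this sketch they run sequentially.

```python
# Generic fitting-by-simulation sketch (illustrative, not BioNetFit itself).
import math

def simulate(k, times):
    """Stand-in 'simulator': exponential decay with rate constant k."""
    return [math.exp(-k * t) for t in times]

def fit(times, observed, candidates):
    """Return the candidate k minimizing the sum of squared residuals."""
    def cost(k):
        return sum((p - o) ** 2 for p, o in zip(simulate(k, times), observed))
    return min(candidates, key=cost)

times = [0.0, 1.0, 2.0, 3.0]
observed = [math.exp(-0.5 * t) for t in times]   # synthetic data, true k = 0.5
best_k = fit(times, observed, [0.1 * i for i in range(1, 21)])
# best_k recovers a value close to the true rate constant 0.5
```

    Because each `cost` evaluation is independent, this loop parallelizes trivially, which is the property BioNetFit exploits on SLURM, Torque/PBS, or SGE clusters.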

  20. A Science Products Inventory for Citizen-Science Planning and Evaluation

    PubMed Central

    Wiggins, Andrea; Bonney, Rick; LeBuhn, Gretchen; Parrish, Julia K; Weltzin, Jake F

    2018-01-01

    Abstract Citizen science involves a range of practices involving public participation in scientific knowledge production, but outcomes evaluation is complicated by the diversity of the goals and forms of citizen science. Publications and citations are not adequate metrics to describe citizen-science productivity. We address this gap by contributing a science products inventory (SPI) tool, iteratively developed through an expert panel and case studies, intended to support general-purpose planning and evaluation of citizen-science projects with respect to science productivity. The SPI includes a collection of items for tracking the production of science outputs and data practices, which are described and illustrated with examples. Several opportunities for further development of the initial inventory are highlighted, as well as potential for using the inventory as a tool to guide project management, funding, and research on citizen science. PMID:29867254

  1. A Science Products Inventory for Citizen-Science Planning and Evaluation

    PubMed Central

    Wiggins, Andrea; Bonney, Rick; LeBuhn, Gretchen; Parrish, Julia K; Weltzin, Jake F

    2018-01-01

    Abstract Citizen science involves a range of practices involving public participation in scientific knowledge production, but outcomes evaluation is complicated by the diversity of the goals and forms of citizen science. Publications and citations are not adequate metrics to describe citizen-science productivity. We address this gap by contributing a science products inventory (SPI) tool, iteratively developed through an expert panel and case studies, intended to support general-purpose planning and evaluation of citizen-science projects with respect to science productivity. The SPI includes a collection of items for tracking the production of science outputs and data practices, which are described and illustrated with examples. Several opportunities for further development of the initial inventory are highlighted, as well as potential for using the inventory as a tool to guide project management, funding, and research on citizen science. PMID:29867253

  2. BEANS - a software package for distributed Big Data analysis

    NASA Astrophysics Data System (ADS)

    Hypki, Arkadiusz

    2018-07-01

    BEANS software is a new web-based tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of data sets. Its main purpose is to simplify the process of storing, examining, and finding new relations in huge data sets. The software answers a growing need of the astronomical community for a versatile tool to store, analyse, and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready to use in any other research field. It can also be used as a building block for other open-source software.
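
    The query-filter-aggregate workflow that BEANS exposes can be sketched in plain Python (illustrative only; the field names and data are invented, and this is not the BEANS API):

```python
# Filter rows, then aggregate per group: the basic pattern behind
# querying simulation/observation tables (illustrative sketch).
from collections import defaultdict

rows = [
    {"cluster": "A", "star_mass": 0.8},
    {"cluster": "A", "star_mass": 1.2},
    {"cluster": "B", "star_mass": 2.0},
    {"cluster": "B", "star_mass": 0.4},
]

# filter: keep stars above 0.5 solar masses
selected = [r for r in rows if r["star_mass"] > 0.5]

# aggregate: mean mass per cluster
sums = defaultdict(lambda: [0.0, 0])
for r in selected:
    s = sums[r["cluster"]]
    s[0] += r["star_mass"]
    s[1] += 1
mean_mass = {k: total / n for k, (total, n) in sums.items()}
# mean_mass == {"A": 1.0, "B": 2.0}
```

    In BEANS the same kind of pipeline runs distributed across the stored data sets rather than over an in-memory list.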

  3. A science products inventory for citizen-science planning and evaluation

    USGS Publications Warehouse

    Wiggins, Andrea; Bonney, Rick; LeBuhn, Gretchen; Parrish, Julia K.; Weltzin, Jake F.

    2018-01-01

    Citizen science involves a range of practices involving public participation in scientific knowledge production, but outcomes evaluation is complicated by the diversity of the goals and forms of citizen science. Publications and citations are not adequate metrics to describe citizen-science productivity. We address this gap by contributing a science products inventory (SPI) tool, iteratively developed through an expert panel and case studies, intended to support general-purpose planning and evaluation of citizen-science projects with respect to science productivity. The SPI includes a collection of items for tracking the production of science outputs and data practices, which are described and illustrated with examples. Several opportunities for further development of the initial inventory are highlighted, as well as potential for using the inventory as a tool to guide project management, funding, and research on citizen science.

  4. Wind Sensing, Analysis, and Modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed using, and to demonstrate, prototype tools implementing an environment in which to realize a unified system. Due to resource limitations, and at the convenience of the customer, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  5. Wind sensing, analysis, and modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed using, and to demonstrate, prototype tools implementing an environment in which to realize a unified system. Due to resource limitations, and at the convenience of the customer, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  6. A Science Products Inventory for Citizen-Science Planning and Evaluation.

    PubMed

    Wiggins, Andrea; Bonney, Rick; LeBuhn, Gretchen; Parrish, Julia K; Weltzin, Jake F

    2018-06-01

    Citizen science involves a range of practices involving public participation in scientific knowledge production, but outcomes evaluation is complicated by the diversity of the goals and forms of citizen science. Publications and citations are not adequate metrics to describe citizen-science productivity. We address this gap by contributing a science products inventory (SPI) tool, iteratively developed through an expert panel and case studies, intended to support general-purpose planning and evaluation of citizen-science projects with respect to science productivity. The SPI includes a collection of items for tracking the production of science outputs and data practices, which are described and illustrated with examples. Several opportunities for further development of the initial inventory are highlighted, as well as potential for using the inventory as a tool to guide project management, funding, and research on citizen science.

  7. BEANS - a software package for distributed Big Data analysis

    NASA Astrophysics Data System (ADS)

    Hypki, Arkadiusz

    2018-03-01

    BEANS software is a new web-based tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining, and finding new relations in huge datasets. The software answers a growing need of the astronomical community for a versatile tool to store, analyse, and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready to use in any other research field. It can also be used as a building block for other open-source software.

  8. Evaluation of the methodological quality of the Health Protection Agency's 2009 guidance on neuraminidase inhibitors.

    PubMed

    Hopayian, Kevork; Jackson, Lucy

    2012-01-01

    The Health Protection Agency (HPA) issued guidance advocating the prescription of neuraminidase inhibitors in July 2009 in response to a predicted influenza pandemic. Although the contents of the guidance have been debated, the methodology has not. The guidance was evaluated by two reviewers using a validated and internationally recognised tool for assessing guidelines, the Appraisal of Guidelines Research & Evaluation (AGREE) instrument. This tool scores six domains independently of each other. The guidance scored 61% for the domain scope and purpose and 54% for the domain clarity and presentation. By contrast, it scored only 31% for rigour of development, owing to poor linkage of its recommendations to evidence. The HPA should improve its performance in this domain in order to improve the credibility of its future guidance among general practitioners.
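
    The AGREE domain scores cited above are standardized percentages. A minimal sketch of the usual standardization, assuming the original instrument's 4-point item scale (the item counts and ratings below are invented, not the study's data; consult the AGREE documentation for the exact per-domain item counts):

```python
# Standardized AGREE domain score: the obtained total is rescaled between
# the minimum and maximum possible totals for that domain (sketch only).

def agree_domain_score(ratings):
    """ratings: one list of item scores (1-4 scale) per reviewer."""
    n_reviewers = len(ratings)
    n_items = len(ratings[0])
    obtained = sum(sum(r) for r in ratings)
    minimum = 1 * n_items * n_reviewers
    maximum = 4 * n_items * n_reviewers
    return (obtained - minimum) / (maximum - minimum)

# two reviewers rating a hypothetical three-item domain
score = agree_domain_score([[4, 3, 2], [3, 3, 2]])
# score == (17 - 6) / (24 - 6), about 0.61
```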

  9. A psychometric evaluation of the digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
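
    The Classical Test Theory reliability mentioned above is commonly summarized with Cronbach's alpha. A self-contained sketch on invented 0/1 response data (not actual DLCI responses):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals),
# computed here on toy concept-inventory responses (rows = students).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)   # population variance

def cronbach_alpha(scores):
    """scores: list of per-student lists of item scores (0/1 for a CI)."""
    k = len(scores[0])                        # number of items
    items = list(zip(*scores))                # transpose: per-item columns
    totals = [sum(s) for s in scores]         # per-student total scores
    item_var = sum(variance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
alpha = cronbach_alpha(responses)
# alpha is 0.8 for this toy data set
```

    Higher alpha indicates that items covary as a coherent scale, one ingredient of the "sufficiently reliable for research purposes" claim above.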

  10. Teamwork Assessment Tools in Obstetric Emergencies: A Systematic Review.

    PubMed

    Onwochei, Desire N; Halpern, Stephen; Balki, Mrinalini

    2017-06-01

    Team-based training and simulation can improve patient safety, by improving communication, decision making, and performance of team members. Currently, there is no general consensus on whether or not a specific assessment tool is better adapted to evaluate teamwork in obstetric emergencies. The purpose of this qualitative systematic review was to find the tools available to assess team effectiveness in obstetric emergencies. We searched Embase, Medline, PubMed, Web of Science, PsycINFO, CINAHL, and Google Scholar for prospective studies that evaluated nontechnical skills in multidisciplinary teams involving obstetric emergencies. The search included studies from 1944 until January 11, 2016. Data on reliability and validity measures were collected and used for interpretation. A descriptive analysis was performed on the data. Thirteen studies were included in the final qualitative synthesis. All the studies assessed teams in the context of obstetric simulation scenarios, but only six included anesthetists in the simulations. One study evaluated their teamwork tool using just validity measures, five using just reliability measures, and one used both. The most reliable tools identified were the Clinical Teamwork Scale, the Global Assessment of Obstetric Team Performance, and the Global Rating Scale of performance. However, they were still lacking in terms of quality and validity. More work needs to be conducted to establish the validity of teamwork tools for nontechnical skills, and the development of an ideal tool is warranted. Further studies are required to assess how outcomes, such as performance and patient safety, are influenced when using these tools.

  11. Are joint health plans effective for coordination of health services? An analysis based on theory and Danish pre-reform results

    PubMed Central

    Strandberg-Larsen, Martin; Bernt Nielsen, Mikkel; Krasnik, Allan

    2007-01-01

    Background Since 1994 formal health plans have been used for coordination of health care services between the regional and local level in Denmark. From 2007 a substantial reform has changed the administrative boundaries of the system and a new tool for coordination has been introduced. Purpose To assess the use of the pre-reform health plans as a tool for strengthening coordination, quality and preventive efforts between the regional and local level of health care. Methods A survey addressed to: all counties (n=15), all municipalities (n=271) and a randomised selected sample of general practitioners (n=700). Results The stakeholders at the administrative level agree that health plans have not been effective as a tool for coordination. The development of health plans is dominated by the regional level. At the functional level 27 percent of the general practitioners are not familiar with health plans. Among those familiar with health plans, 61 percent report that health plans influence their work only to a lesser degree or not at all. Conclusion Joint health planning is needed to achieve coordination of care. Efforts must be made to overcome barriers hampering efficient whole-system planning. Active policies emphasising the necessity of health planning, despite the costs involved, are warranted to ensure delivery of care that benefits the health of the population. PMID:17925882

  12. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    PubMed Central

    2013-01-01

    Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. 
Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807
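
    The between-group comparisons above used chi-squared tests. As an illustrative re-computation (not the authors' code, and not a statistic reported in the abstract), the allocation-concealment counts given there (5 of 7 PT tools vs 7 of 19 general health tools) can be tested as a 2x2 table:

```python
# Pearson chi-squared for a 2x2 table, without continuity correction;
# the p-value uses the 1-df chi-squared survival function via erfc.
import math

def chi2_2x2(a, b, c, d):
    """Return (statistic, p-value) for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2))   # P(chi2_1 > stat)
    return stat, p

stat, p = chi2_2x2(5, 2, 7, 12)   # PT: 5 include / 2 not; general: 7 / 12
# stat is about 2.46 and p about 0.12 for these counts
```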

  13. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    PubMed

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. 
Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.

  14. An Ontology for Requesting Distant Robotic Action: A Case Study in Naming and Action Identification for Planning on the Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Wales, Roxana C.; Shalin, Valerie L.; Bass, Deborah S.

    2004-01-01

    This paper focuses on the development and use of the abbreviated names as well as an emergent ontology associated with making requests for action of a distant robotic rover during the 2003-2004 NASA Mars Exploration Rover (MER) mission, run by the Jet Propulsion Laboratory. The infancy of the domain of Martian telerobotic science, in which specialists request work from a rover moving through the landscape, as well as the need to consider the interdisciplinary teams involved in the work required an empirical approach. The formulation of this ontology is grounded in human behavior and work practice. The purpose of this paper is to identify general issues for an ontology of action (specifically for requests for action), while maintaining sensitivity to the users, tools and the work system within a specific technical domain. We found that this ontology of action must take into account a dynamic environment, changing in response to the movement of the rover, changes on the rover itself, as well as be responsive to the purposeful intent of the science requestors. Analysis of MER mission events demonstrates that the work practice and even robotic tool usage changes over time. Therefore, an ontology must adapt and represent both incremental change and revolutionary change, and the ontology can never be more than a partial agreement on the conceptualizations involved. Although examined in a rather unique technical domain, the general issues pertain to the control of any complex, distributed work system as well as the archival record of its accomplishments.

  15. A General-Purpose Optimization Engine for Multi-Disciplinary Design Applications

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A general purpose optimization tool for multidisciplinary applications, which in the literature is known as COMETBOARDS, is being developed at NASA Lewis Research Center. The modular organization of COMETBOARDS includes several analyzers and state-of-the-art optimization algorithms along with their cascading strategy. The code structure allows quick integration of new analyzers and optimizers. The COMETBOARDS code reads input information from a number of data files, formulates a design as a set of multidisciplinary nonlinear programming problems, and then solves the resulting problems. COMETBOARDS can be used to solve a large problem which can be defined through multiple disciplines, each of which can be further broken down into several subproblems. Alternatively, a small portion of a large problem can be optimized in an effort to improve an existing system. Some of the other unique features of COMETBOARDS include design variable formulation, constraint formulation, subproblem coupling strategy, global scaling technique, analysis approximation, use of either sequential or parallel computational modes, and so forth. The special features and unique strengths of COMETBOARDS assist convergence and reduce the amount of CPU time used to solve the difficult optimization problems of aerospace industries. COMETBOARDS has been successfully used to solve a number of problems, including structural design of space station components, design of nozzle components of an air-breathing engine, configuration design of subsonic and supersonic aircraft, mixed flow turbofan engines, wave rotor topped engines, and so forth. This paper introduces the COMETBOARDS design tool and its versatility, which is illustrated by citing examples from structures, aircraft design, and air-breathing propulsion engine design.
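
    The cascading strategy described above can be sketched generically (illustrative only, not the COMETBOARDS implementation): optimizers are applied in sequence, each starting from the best design found by its predecessor. The toy "design" objective and both optimizers below are invented for this example.

```python
# Cascade sketch: a coarse global stage hands its best point to a local
# refinement stage, mimicking the optimizer-cascading idea.
import random

def objective(x):
    # toy constrained design problem: quadratic cost plus a penalty for x < 1
    return (x - 3.0) ** 2 + (100.0 if x < 1.0 else 0.0)

def random_search(f, x0, rng, iters=200, step=2.0):
    """Coarse global stage: accept random perturbations that improve f."""
    best = x0
    for _ in range(iters):
        cand = best + rng.uniform(-step, step)
        if f(cand) < f(best):
            best = cand
    return best

def pattern_refine(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    """Local stage: move in +/- step while it helps, then shrink the step."""
    best = x0
    while step > tol:
        moved = True
        while moved:
            moved = False
            for cand in (best - step, best + step):
                if f(cand) < f(best):
                    best, moved = cand, True
        step *= shrink
    return best

rng = random.Random(42)
cascade = [lambda f, x: random_search(f, x, rng), pattern_refine]
x = -5.0
for stage in cascade:
    x = stage(objective, x)
# x converges to 3.0, the optimum of the toy problem
```

    COMETBOARDS applies the same hand-off idea to real nonlinear programming subproblems spanning multiple disciplines, which is where the convergence and CPU-time benefits cited above come from.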

  16. A Tool for the Administration and Management of University Profile Information

    ERIC Educational Resources Information Center

    Bulchand, Jacques; Rodriguez, Jorge; Chattah, Ana C.

    2005-01-01

    Purpose: The purpose of this paper is to present a management tool that helps to achieve the objectives of the plan for info-tech systems and communications of the University of Las Palmas de Gran Canaria for the 2003-2006 period. Design/methodology/approach: The methodology used in this case is nothing if not practical. The chosen tool involved…

  17. Talking Online: Reflecting on Online Communication Tools

    ERIC Educational Resources Information Center

    Greener, Susan

    2009-01-01

    Purpose: The purpose of this paper is to reflect on the value and constraints of varied online communication tools from web 2.0 to e-mail in a higher education (HE) teaching and learning context, where these tools are used to support or be the main focus of learning. Design/methodology/approach: A structured reflection is produced with the aid of…

  18. A Review of Tools and Techniques for Data-Enabled Formative Assessment

    ERIC Educational Resources Information Center

    Nyland, Rob

    2018-01-01

    The purpose of this literature review is to understand the current state of research on tools that collect data for the purpose of formative assessment. We were interested in identifying the types of data collected by these tools, how these data were processed, and how the processed data were presented to the instructor or student for the purpose…

  19. Self-Learning Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition or with strong frustration, for which local updates perform badly. In this work, we propose a new general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup. This work is supported by the DOE Office of Basic Energy Sciences, Division of Materials Sciences and Engineering under Award DE-SC0010526.
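
    A minimal sketch of the SLMC workflow (illustrative code, not the authors'): (1) run a short trial simulation with local Metropolis updates, then (2) learn an effective model from the configurations it produced. In full SLMC the learned model then drives fast global proposal moves, with a final Metropolis accept/reject step on the true energy keeping the simulation exact; that third stage is only described in comments here.

```python
# SLMC stages (1) and (2) on a tiny 1D Ising chain.
import math
import random

def energy(spins, J):
    """1D periodic Ising energy, E = -J * sum_i s_i * s_{i+1}."""
    n = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def bond_sum(spins):
    n = len(spins)
    return sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def local_metropolis(spins, J, beta, steps, rng):
    """Single-spin-flip Metropolis (energy recomputed naively for clarity)."""
    for _ in range(steps):
        i = rng.randrange(len(spins))
        trial = spins[:i] + [-spins[i]] + spins[i + 1:]
        dE = energy(trial, J) - energy(spins, J)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins = trial
    return spins

# (1) trial simulation: collect (bond_sum, energy) training pairs
J_true, beta = 1.0, 0.5
rng = random.Random(7)
spins = [rng.choice([-1, 1]) for _ in range(16)]
samples = []
for _ in range(50):
    spins = local_metropolis(spins, J_true, beta, 20, rng)
    samples.append((bond_sum(spins), energy(spins, J_true)))

# (2) learn J_eff by least squares on the ansatz E = -J_eff * bond_sum
num = sum(-b * e for b, e in samples)
den = sum(b * b for b, _ in samples)
J_eff = num / den
# J_eff matches J_true here because the ansatz contains the true model;
# (3) in SLMC, cheap global updates under J_eff would now propose moves
#     that the true Hamiltonian accepts or rejects.
```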

  20. Microengineering in cardiovascular research: new developments and translational applications.

    PubMed

    Chan, Juliana M; Wong, Keith H K; Richards, Arthur Mark; Drum, Chester L

    2015-04-01

    Microfluidic, cellular co-cultures that approximate macro-scale biology are important tools for refining the in vitro study of organ-level function and disease. In recent years, advances in technical fabrication and biological integration have provided new insights into biological phenomena, improved diagnostic measurements, and made major steps towards de novo tissue creation. Here we review applications of these technologies specific to the cardiovascular field, emphasizing three general categories of use: reductionist vascular models, tissue-engineered vascular models, and point-of-care diagnostics. With continued progress in the ability to purposefully control microscale environments, the detailed study of both primary and cultured cells may find new relevance in the general cardiovascular research community. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.

  1. Atomicrex—a general purpose tool for the construction of atomic interaction models

    NASA Astrophysics Data System (ADS)

    Stukowski, Alexander; Fransson, Erik; Mock, Markus; Erhart, Paul

    2017-07-01

    We introduce atomicrex, an open-source code for constructing interatomic potentials as well as more general types of atomic-scale models. Such effective models are required to simulate extended materials structures comprising many thousands of atoms or more, because electronic structure methods become computationally too expensive at this scale. atomicrex covers a wide range of interatomic potential types and fulfills many needs in atomistic model development. As inputs, it supports experimental property values as well as ab initio energies and forces, to which models can be fitted using various optimization algorithms. The open architecture of atomicrex allows it to be used in custom model-development scenarios beyond classical interatomic potentials, and thanks to its Python interface it can be readily integrated with, e.g., electronic structure calculations or machine learning algorithms.
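
    The fitting task atomicrex automates can be sketched as follows (illustrative only; this is neither atomicrex's input format nor its API): adjust pair-potential parameters until predicted dimer energies match ab initio-style reference values, here generated synthetically so that the known parameters are recovered.

```python
# Fit Lennard-Jones (epsilon, sigma) to synthetic "ab initio" dimer energies
# by brute-force search over a small parameter grid (sketch only).

def lj(r, eps, sigma):
    """Lennard-Jones pair energy at separation r."""
    x = (sigma / r) ** 6
    return 4.0 * eps * (x * x - x)

# reference data generated from eps=0.2, sigma=2.5 (stands in for DFT output)
ref = [(r / 10.0, lj(r / 10.0, 0.2, 2.5)) for r in range(22, 60, 2)]

def cost(eps, sigma):
    return sum((lj(r, eps, sigma) - e) ** 2 for r, e in ref)

best = min(
    ((eps / 100.0, sig / 10.0) for eps in range(5, 40, 5) for sig in range(20, 31)),
    key=lambda p: cost(*p),
)
# best == (0.2, 2.5): the reference parameters are recovered exactly
```

    atomicrex replaces the brute-force grid with proper optimization algorithms and supports many more potential forms, but the objective-function structure is the same.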

  2. Data Assimilation as a Tool for Developing a Mars International Reference Atmosphere

    NASA Technical Reports Server (NTRS)

    Houben, Howard

    2005-01-01

A new paradigm for a Mars International Reference Atmosphere is proposed. In general, as is certainly now the case for Mars, there are sufficient observational data to specify what the full atmospheric state was under a variety of circumstances (season, dustiness, etc.). There are also general circulation models capable of determining the evolution of these states. If these capabilities are combined using data assimilation techniques, the resulting analyzed states can be probed to answer a wide variety of questions, whether posed by scientists, mission planners, or others. This system would fulfill all the purposes of an international reference atmosphere and would make the scientific results of exploration missions readily available to the community. Preliminary work on a website that would incorporate this functionality has begun.

  3. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both the relationship to project phase and the primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  4. Use of administrative medical databases in population-based research.

    PubMed

    Gavrielov-Yusim, Natalie; Friger, Michael

    2014-03-01

Administrative medical databases are massive repositories of data collected in healthcare for various purposes. Such databases are maintained in hospitals, health maintenance organisations and health insurance organisations. Administrative databases may contain medical claims for reimbursement, records of health services, medical procedures, prescriptions, and diagnosis information. It is clear that such systems may provide a valuable variety of clinical and demographic information as well as an on-going process of data collection. In general, data collection in these databases is not initially planned or designed for research purposes. Nonetheless, administrative databases may be used as a robust research tool. In this article, we address the subject of public health research that employs administrative data. We discuss the biases and the limitations of such research, as well as other important epidemiological and biostatistical key points specific to administrative database studies.

  5. Design and simulation of EVA tools for first servicing mission of HST

    NASA Technical Reports Server (NTRS)

    Naik, Dipak; Dehoff, P. H.

    1994-01-01

The Hubble Space Telescope (HST) was launched into near-earth orbit by the Space Shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror causes HST to produce images of much lower quality than intended. A Space Shuttle repair mission in January 1994 installed small corrective mirrors that restored the full intended optical capability of the HST. The First Servicing Mission (FSM) involved considerable Extra Vehicular Activity (EVA). Special EVA tools for the FSM were designed and developed for this specific purpose. In an earlier report, the details of the Data Acquisition System developed to test the performance of the various EVA tools in ambient as well as simulated space environments were presented. The general schematic of the test setup is reproduced in this report for continuity. Although the data acquisition system was used extensively to test a number of fasteners, only the results of one test each, carried out on the various fasteners and the Power Ratchet Tool, are included in this report.

  6. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
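
    The cross-correlation technique listed among those evaluated can be illustrated with a minimal sketch: a step-shaped template slid along a synthetic telemetry trace to locate an activation-like transition. The trace and template below are invented for illustration and do not represent actual Shuttle telemetry.

    ```python
    import numpy as np

    # Synthetic telemetry trace: a flat baseline with a step increase at
    # sample 60, standing in for an equipment-activation signature.
    trace = np.concatenate([np.zeros(60), np.ones(40)])

    # Template of the pattern to detect: a unit step (negative half then
    # positive half), so a matching transition produces a strong peak.
    template = np.concatenate([-np.ones(5), np.ones(5)])

    # Sliding cross-correlation; the peak marks where the trace best
    # matches the step template.
    scores = np.correlate(trace, template, mode="valid")
    detected = int(np.argmax(scores))

    # The transition itself sits half a template length past the peak.
    step_location = detected + len(template) // 2
    ```

    A production system would add noise handling, thresholds on the correlation score, and comparison against the neural-network and statistical detectors mentioned above; this sketch shows only the core matching step.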

  7. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.

  8. Visualization Tools for Lattice QCD - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massimo Di Pierro

    2012-03-15

Our research project is about the development of visualization tools for Lattice QCD. We developed various tools by extending existing libraries, adding new algorithms, exposing new APIs, and creating web interfaces (including the new NERSC gauge connection web site). Our tools cover the full stack of operations from automating download of data, to generating VTK files (topological charge, plaquette, Polyakov lines, quark and meson propagators, currents), to turning the VTK files into images, movies, and web pages. Some of the tools have their own web interfaces. Some Lattice QCD visualizations have been created in the past but, to our knowledge, our tools are the only ones of their kind since they are general purpose, customizable, and relatively easy to use. We believe they will be valuable to physicists working in the field. They can be used to better teach Lattice QCD concepts to new graduate students; they can be used to observe the changes in topological charge density and detect possible sources of bias in computations; they can be used to observe the convergence of the algorithms at a local level and determine possible problems; they can be used to probe heavy-light mesons with currents and determine their spatial distribution; they can be used to detect corrupted gauge configurations. There are some indirect results of this grant that will benefit a broader audience than Lattice QCD physicists.

  9. A study on the effect of tool electrode thickness on MRR and TWR in the electrical discharge turning process

    NASA Astrophysics Data System (ADS)

    Gohil, Vikas; Puri, YM

    2018-04-01

Turning by electrical discharge machining (EDM) is an emerging area of research. Generally, wire-EDM is used in EDM turning because it avoids electrode tooling costs. In wire-EDM turning, however, the wire electrode leaves cusps on the machined surface because of its small diameter, and wire breakage greatly affects the surface finish of the machined part. Moreover, one limitation of the process is its low machining speed compared to the constituent processes. In this study, conventional EDM was employed for turning in order to generate free-form cylindrical geometries on difficult-to-cut materials. A specially designed turning spindle was therefore mounted on a conventional die-sinking EDM machine to rotate the work piece. A conductive preshaped strip of copper, acting as a forming tool, is fed (reciprocated) continuously against the rotating work piece; thus, a mirror image of the tool is formed on the circumference of the work piece. In this way, an axisymmetric work piece can be made with small tools. The developed process is termed electrical discharge turning (EDT). In the experiments, the effects of machining parameters such as pulse-on time, peak current, gap voltage, and tool thickness on the material removal rate (MRR) and tool wear rate (TWR) were investigated, and practical machining was carried out by turning an SS-304 stainless steel work piece.

  10. Guidelines for overcoming hospital managerial challenges: a systematic literature review

    PubMed Central

    Crema, Maria; Verbano, Chiara

    2013-01-01

    Purpose The need to respond to accreditation institutes’ and patients’ requirements and to align health care results with increased medical knowledge is focusing greater attention on quality in health care. Different tools and techniques have been adopted to measure and manage quality, but clinical errors are still too numerous, suggesting that traditional quality improvement systems are unable to deal appropriately with hospital challenges. The purpose of this paper is to grasp the current tools, practices, and guidelines adopted in health care to improve quality and patient safety and create a base for future research on this young subject. Methods A systematic literature review was carried out. A search of academic databases, including papers that focus not only on lean management, but also on clinical errors and risk reduction, yielded 47 papers. The general characteristics of the selected papers were analyzed, and a content analysis was conducted. Results A variety of managerial techniques, tools, and practices are being adopted in health care, and traditional methodologies have to be integrated with the latest ones in order to reduce errors and ensure high quality and patient safety. As it has been demonstrated, these tools are useful not only for achieving efficiency objectives, but also for providing higher quality and patient safety. Critical indications and guidelines for successful implementation of new health managerial methodologies are provided and synthesized in an operative scheme useful for extending and deepening knowledge of these issues with further studies. Conclusion This research contributes to introducing a new theme in health care literature regarding the development of successful projects with both clinical risk management and health lean management objectives, and should address solutions for improving health care even in the current context of decreasing resources. PMID:24307833

  11. Diamond Smoothing Tools

    NASA Technical Reports Server (NTRS)

    Voronov, Oleg

    2007-01-01

    Diamond smoothing tools have been proposed for use in conjunction with diamond cutting tools that are used in many finish-machining operations. Diamond machining (including finishing) is often used, for example, in fabrication of precise metal mirrors. A diamond smoothing tool according to the proposal would have a smooth spherical surface. For a given finish machining operation, the smoothing tool would be mounted next to the cutting tool. The smoothing tool would slide on the machined surface left behind by the cutting tool, plastically deforming the surface material and thereby reducing the roughness of the surface, closing microcracks and otherwise generally reducing or eliminating microscopic surface and subsurface defects, and increasing the microhardness of the surface layer. It has been estimated that if smoothing tools of this type were used in conjunction with cutting tools on sufficiently precise lathes, it would be possible to reduce the roughness of machined surfaces to as little as 3 nm. A tool according to the proposal would consist of a smoothing insert in a metal holder. The smoothing insert would be made from a diamond/metal functionally graded composite rod preform, which, in turn, would be made by sintering together a bulk single-crystal or polycrystalline diamond, a diamond powder, and a metallic alloy at high pressure. To form the spherical smoothing tip, the diamond end of the preform would be subjected to flat grinding, conical grinding, spherical grinding using diamond wheels, and finally spherical polishing and/or buffing using diamond powders. If the diamond were a single crystal, then it would be crystallographically oriented, relative to the machining motion, to minimize its wear and maximize its hardness. Spherically polished diamonds could also be useful for purposes other than smoothing in finish machining: They would likely also be suitable for use as heat-resistant, wear-resistant, unlubricated sliding-fit bearing inserts.

  12. 7 CFR 2902.48 - General purpose household cleaners.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... PROCUREMENT Designated Items § 2902.48 General purpose household cleaners. (a) Definition. Products designed... procurement preference for qualifying biobased general purpose household cleaners. By that date, Federal... 7 Agriculture 15 2010-01-01 2010-01-01 false General purpose household cleaners. 2902.48 Section...

  13. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts.
Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.

  14. Video games as a tool to train visual skills

    PubMed Central

    Achtman, R.L.; Green, C.S.; Bavelier, D.

    2010-01-01

    Purpose Adult brain plasticity, although possible, is often difficult to elicit. Training regimens in adults can produce specific improvements on the trained task without leading to general enhancements that would improve quality of life. This paper considers the case of playing action video games as a way to induce widespread enhancement in vision. Conclusions We review the range of visual skills altered by action video game playing as well as the game components important in promoting visual plasticity. Further, we discuss what these results might mean in terms of rehabilitation for different patient populations. PMID:18997318

  15. JEDI - an executive dashboard and decision support system for lean global military medical resource and logistics management.

    PubMed

    Sloane, Elliot B; Rosow, Eric; Adam, Joe; Shine, Dave

    2006-01-01

    Each individual U.S. Air Force, Army, and Navy Surgeon General has integrated oversight of global medical supplies and resources using the Joint Medical Asset Repository (JMAR). A Business Intelligence system called the JMAR Executive Dashboard Initiative (JEDI) was developed over a three-year period to add real-time interactive data-mining tools and executive dashboards. Medical resources can now be efficiently reallocated to military, veteran, family, or civilian purposes and inventories can be maintained at lean levels with peaks managed by interactive dashboards that reduce workload and errors.

  16. Finite Element Modelling and Analysis of Conventional Pultrusion Processes

    NASA Astrophysics Data System (ADS)

    Akishin, P.; Barkanov, E.; Bondarchuk, A.

    2015-11-01

Pultrusion is one of many composite manufacturing techniques and one of the most efficient methods for producing fiber reinforced polymer composite parts with a constant cross-section. Numerical simulation is helpful for understanding the manufacturing process and developing scientific means for the pultrusion tooling design. A numerical technique based on the finite element method has been developed for the simulation of pultrusion processes. It uses the general-purpose finite element software ANSYS Mechanical. It is shown that the developed technique predicts the temperature and cure profiles, which are in good agreement with those published in the open literature.
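
    As a toy stand-in for the thermal part of such a simulation, a 1D explicit finite-difference heat-conduction loop can illustrate how a temperature profile develops across a heated die. All material values below are assumptions chosen for illustration; a production analysis would use a full FEM package such as ANSYS Mechanical as described above.

    ```python
    import numpy as np

    # Assumed material and geometry values (illustrative only).
    alpha = 1e-7            # thermal diffusivity, m^2/s
    length, nx = 0.01, 21   # 1 cm across the profile thickness, 21 nodes
    dx = length / (nx - 1)
    dt = 0.4 * dx * dx / alpha   # explicit time step (stable: r = 0.4 < 0.5)

    # Initial profile temperature with heated die walls as boundaries.
    T = np.full(nx, 20.0)        # deg C
    T[0] = T[-1] = 150.0

    for _ in range(2000):
        # Explicit update of interior nodes: dT/dt = alpha * d2T/dx2.
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    ```

    After enough steps the interior approaches the wall temperature, mirroring the kind of through-thickness temperature profile a pultrusion die model must resolve before the cure kinetics can be evaluated.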

  17. Performance characteristics of three-phase induction motors

    NASA Technical Reports Server (NTRS)

    Wood, M. E.

    1977-01-01

    An investigation into the characteristics of three phase, 400 Hz, induction motors of the general type used on aircraft and spacecraft is summarized. Results of laboratory tests are presented and compared with results from a computer program. Representative motors were both tested and simulated under nominal conditions as well as off nominal conditions of temperature, frequency, voltage magnitude, and voltage balance. Good correlation was achieved between simulated and laboratory results. The primary purpose of the program was to verify the simulation accuracy of the computer program, which in turn will be used as an analytical tool to support the shuttle orbiter.

  18. Medical Imaging System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The MD Image System, a true-color image processing system that serves as a diagnostic aid and tool for storage and distribution of images, was developed by Medical Image Management Systems, Huntsville, AL, as a "spinoff from a spinoff." The original spinoff, Geostar 8800, developed by Crystal Image Technologies, Huntsville, incorporates advanced UNIX versions of ELAS (developed by NASA's Earth Resources Laboratory for analysis of Landsat images) for general purpose image processing. The MD Image System is an application of this technology to a medical system that aids in the diagnosis of cancer, and can accept, store and analyze images from other sources such as Magnetic Resonance Imaging.

  19. Target-D: a stratified individually randomized controlled trial of the diamond clinical prediction tool to triage and target treatment for depressive symptoms in general practice: study protocol for a randomized controlled trial.

    PubMed

    Gunn, Jane; Wachtler, Caroline; Fletcher, Susan; Davidson, Sandra; Mihalopoulos, Cathrine; Palmer, Victoria; Hegarty, Kelsey; Coe, Amy; Murray, Elizabeth; Dowrick, Christopher; Andrews, Gavin; Chondros, Patty

    2017-07-20

    Depression is a highly prevalent and costly disorder. Effective treatments are available but are not always delivered to the right person at the right time, with both under- and over-treatment a problem. Up to half the patients presenting to general practice report symptoms of depression, but general practitioners have no systematic way of efficiently identifying level of need and allocating treatment accordingly. Therefore, our team developed a new clinical prediction tool (CPT) to assist with this task. The CPT predicts depressive symptom severity in three months' time and based on these scores classifies individuals into three groups (minimal/mild, moderate, severe), then provides a matched treatment recommendation. This study aims to test whether using the CPT reduces depressive symptoms at three months compared with usual care. The Target-D study is an individually randomized controlled trial. Participants will be 1320 general practice patients with depressive symptoms who will be approached in the practice waiting room by a research assistant and invited to complete eligibility screening on an iPad. Eligible patients will provide informed consent and complete the CPT on a purpose-built website. A computer-generated allocation sequence stratified by practice and depressive symptom severity group, will randomly assign participants to intervention (treatment recommendation matched to predicted depressive symptom severity group) or comparison (usual care plus Target-D attention control) arms. Follow-up assessments will be completed online at three and 12 months. The primary outcome is depressive symptom severity at three months. Secondary outcomes include anxiety, mental health self-efficacy, quality of life, and cost-effectiveness. Intention-to-treat analyses will test for differences in outcome means between study arms overall and by depressive symptom severity group. 
To our knowledge, this is the first depressive symptom stratification tool designed for primary care which takes a prognosis-based approach to provide a tailored treatment recommendation. If shown to be effective, this tool could be used to assist general practitioners to implement stepped mental-healthcare models and contribute to a more efficient and effective mental health system. Australian New Zealand Clinical Trials Registry (ANZCTR 12616000537459 ). Retrospectively registered on 27 April 2016. See Additional file 1 for trial registration data.

  20. Increasing the Operational Value of Event Messages

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one at a time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.

  1. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
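
    The three components above can be illustrated with a toy two-variable graph: multiplying local factors yields the joint distribution, and conditioning on an observation yields a posterior. The variables and probabilities below are hypothetical and chosen only to make the arithmetic transparent.

    ```python
    import numpy as np

    # Two binary variables: R (rain) and W (wet soil), each indexed 0/1.
    # Factor 1: prior over R.
    f_R = np.array([0.8, 0.2])            # P(R=0), P(R=1)

    # Factor 2: conditional P(W | R); rows indexed by R, columns by W.
    f_WR = np.array([[0.9, 0.1],          # P(W | R=0)
                     [0.2, 0.8]])         # P(W | R=1)

    # Multiplying the local factors yields the joint distribution P(R, W).
    joint = f_R[:, None] * f_WR

    # Condition on the observation W=1 and renormalize to obtain the
    # posterior P(R | W=1), i.e. assimilate the data into the model.
    posterior = joint[:, 1] / joint[:, 1].sum()
    ```

    In a hydrological model the same pattern scales up: nodes become parameters and states, factors encode process equations and observation models, and general-purpose algorithms exploit the graph structure so that the full joint distribution never has to be enumerated as it is here.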

  2. 78 FR 77662 - Notice of Availability (NOA) for General Purpose Warehouse and Information Technology Center...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-24

    ... (NOA) for General Purpose Warehouse and Information Technology Center Construction (GPW/IT)--Tracy Site.... ACTION: Notice of Availability (NOA) for General Purpose Warehouse and Information Technology Center... FR 65300) announcing the publication of the General Purpose Warehouse and Information Technology...

  3. A comparison of four geophysical methods for determining the shear wave velocity of soils

    USGS Publications Warehouse

    Anderson, N.; Thitimakorn, T.; Ismail, A.; Hoffman, D.

    2007-01-01

    The Missouri Department of Transportation (MoDOT) routinely acquires seismic cone penetrometer (SCPT) shear wave velocity control as part of the routine investigation of soils within the Mississippi Embayment. In an effort to ensure their geotechnical investigations are as effective and efficient as possible, the SCPT tool and several available alternatives (crosshole [CH]; multichannel analysis of surface waves [MASW]; and refraction microtremor [ReMi]) were evaluated and compared on the basis of field data acquired at two test sites in southeast Missouri. These four methods were ranked in terms of accuracy, functionality, cost, other considerations, and overall utility. It is concluded that MASW data are generally more reliable than SCPT data, comparable to quality ReMi data, and only slightly less accurate than CH data. However, the other advantages of MASW generally make it a superior choice over the CH, SCPT, and ReMi methods for general soil classification purposes to depths of 30 m. MASW data are less expensive than CH data and SCPT data and can normally be acquired in areas inaccessible to drill and SCPT rigs. In contrast to the MASW tool, quality ReMi data can be acquired only in areas where there are interpretable levels of "passive" acoustic energy and only when the geophone array is aligned with the source(s) of such energy.

  4. Automated subtyping of HIV-1 genetic sequences for clinical and surveillance purposes: performance evaluation of the new REGA version 3 and seven other tools.

    PubMed

    Pineda-Peña, Andrea-Clemencia; Faria, Nuno Rodrigues; Imbrechts, Stijn; Libin, Pieter; Abecasis, Ana Barroso; Deforche, Koen; Gómez-López, Arley; Camacho, Ricardo J; de Oliveira, Tulio; Vandamme, Anne-Mieke

    2013-10-01

    To investigate differences in pathogenesis, diagnosis and resistance pathways between HIV-1 subtypes, an accurate subtyping tool for large datasets is needed. We aimed to evaluate the performance of automated subtyping tools to classify the different subtypes and circulating recombinant forms using pol, the most sequenced region in clinical practice. We also present the upgraded version 3 of the Rega HIV subtyping tool (REGAv3). HIV-1 pol sequences (PR+RT) for 4674 patients retrieved from the Portuguese HIV Drug Resistance Database, and 1872 pol sequences trimmed from full-length genomes retrieved from the Los Alamos database were classified with statistical-based tools such as COMET, jpHMM and STAR; similarity-based tools such as NCBI and Stanford; and phylogenetic-based tools such as REGA version 2 (REGAv2), REGAv3, and SCUEAL. The performance of these tools, for pol, and for PR and RT separately, was compared in terms of reproducibility, sensitivity and specificity with respect to the gold standard which was manual phylogenetic analysis of the pol region. The sensitivity and specificity for subtypes B and C was more than 96% for seven tools, but was variable for other subtypes such as A, D, F and G. With regard to the most common circulating recombinant forms (CRFs), the sensitivity and specificity for CRF01_AE was ~99% with statistical-based tools, with phylogenetic-based tools and with Stanford, one of the similarity based tools. CRF02_AG was correctly identified for more than 96% by COMET, REGAv3, Stanford and STAR. All the tools reached a specificity of more than 97% for most of the subtypes and the two main CRFs (CRF01_AE and CRF02_AG). Other CRFs were identified only by COMET, REGAv2, REGAv3, and SCUEAL and with variable sensitivity. When analyzing sequences for PR and RT separately, the performance for PR was generally lower and variable between the tools. 
Similarity- and statistical-based tools were 100% reproducible, but reproducibility was lower for phylogenetic-based tools such as REGA (~99%) and SCUEAL (~96%). REGAv3 had an improved performance for subtype B and CRF02_AG compared to REGAv2 and is now also able to identify all epidemiologically relevant CRFs. In general, the best-performing tools, in alphabetical order, were COMET, jpHMM, REGAv3, and SCUEAL when analyzing pure subtypes in the pol region, and COMET and REGAv3 when analyzing most of the CRFs. Based on this study, we recommend confirming subtype assignments with two well-performing tools, and caution in the interpretation of short sequences. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
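
    The per-subtype sensitivity and specificity comparison described above reduces to standard confusion-matrix counts. A minimal sketch (the gold-standard and predicted labels below are made up for illustration, not data from the study):

```python
# Illustrative only: toy manual gold-standard vs. tool-assigned subtypes.
gold      = ["B", "B", "C", "A", "CRF01_AE", "B", "G"]
predicted = ["B", "B", "C", "B", "CRF01_AE", "B", "A"]

def sensitivity_specificity(subtype):
    # One-vs-rest confusion-matrix counts for a single subtype.
    pairs = list(zip(gold, predicted))
    tp = sum(1 for g, p in pairs if g == subtype and p == subtype)
    fn = sum(1 for g, p in pairs if g == subtype and p != subtype)
    tn = sum(1 for g, p in pairs if g != subtype and p != subtype)
    fp = sum(1 for g, p in pairs if g != subtype and p == subtype)
    return tp / (tp + fn), tn / (tn + fp)

sens_b, spec_b = sensitivity_specificity("B")  # subtype B: 3 TP, 0 FN, 1 FP, 3 TN
```

    Reproducibility is measured the same way, except the two label lists come from repeated runs of the same tool rather than from tool vs. gold standard.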

  5. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.

    PubMed

    Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G

    2016-03-01

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
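
    The core task, optimizing parameter values for consistency with data, can be sketched in miniature. This is not BioNetFit's actual algorithm or API; the exponential-decay "simulator" and the naive random search below are invented stand-ins (in the real tool, simulations come from BioNetGen/NFsim):

```python
import math
import random

# Hypothetical one-observable model: exponential decay y(t) = A * exp(-k * t).
def simulate(params, times):
    a, k = params
    return [a * math.exp(-k * t) for t in times]

times = [0.0, 1.0, 2.0, 3.0, 4.0]
observed = simulate((2.0, 0.5), times)   # synthetic "data" with known A=2.0, k=0.5

def cost(params):
    # Sum of squared residuals between simulation and data.
    return sum((s - o) ** 2 for s, o in zip(simulate(params, times), observed))

# Naive random search over a parameter box; each candidate evaluation is
# independent, which is why this style of fitting distributes well across
# cluster nodes when individual simulations are expensive.
random.seed(42)
best = min(
    ((random.uniform(0.1, 5.0), random.uniform(0.01, 2.0)) for _ in range(5000)),
    key=cost,
)
```

    The independence of candidate evaluations is the property that makes distributed execution (e.g. one simulation per cluster job) straightforward.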

  6. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  7. Improved MRF spot characterization with QIS metrology

    NASA Astrophysics Data System (ADS)

    Westover, Sandi; Hall, Christopher; DeMarco, Michael

    2013-09-01

    Careful characterization of the removal function of sub-aperture polishing tools is critical for optimum polishing results. Magnetorheological finishing (MRF®) creates a polishing tool, or "spot", that is unique both for its locally high removal rate and high slope content. For a variety of reasons, which will be discussed, longer-duration spots are beneficial to improving MRF performance, but longer spots yield higher slopes, rendering them difficult to measure with adequate fidelity. QED's Interferometer for Stitching (QIS™) was designed to measure the high slope content inherent to non-null sub-aperture stitching interferometry of aspheres. Based on this unique capability, the QIS was recently used to measure various MRF spots to see whether improved knowledge of these longer-duration spots yields a corresponding improvement in MRF performance. The results of these tests will be presented and compared with those of a standard general-purpose interferometer.

  8. Development of the Classroom Sensory Environment Assessment (CSEA).

    PubMed

    Kuhaneck, Heather Miller; Kelleher, Jaqueline

    2015-01-01

    The Classroom Sensory Environment Assessment (CSEA) is a tool that provides a means of understanding the impact of a classroom's sensory environment on student behavior. The purpose of the CSEA is to promote collaboration between occupational therapists and elementary education teachers. In particular, students with autism spectrum disorder included in general education classrooms may benefit from a suitable match created through this collaborative process between the sensory environment and their unique sensory preferences. The development of the CSEA has occurred in multiple stages over 2 yr. This article reports on descriptive results for 152 classrooms and initial reliability results. Descriptive information suggests that classrooms are environments with an enormous variety of sensory experiences that can be quantified. Visual experiences are most frequent. The tool has adequate internal consistency but requires further investigation of interrater reliability and validity. Copyright © 2015 by the American Occupational Therapy Association, Inc.

  9. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  10. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  11. "Genetically Engineered" Nanoelectronics

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Salazar-Lazaro, Carlos H.; Stoica, Adrian; Cwik, Thomas

    2000-01-01

    The quantum mechanical functionality of nanoelectronic devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors (QWIPs), quantum well lasers, and heterostructure field effect transistors (HFETs) is enabled by material variations on an atomic scale. The design and optimization of such devices requires a fundamental understanding of electron transport in such dimensions. The Nanoelectronic Modeling Tool (NEMO) is a general-purpose quantum device design and analysis tool based on a fundamental non-equilibrium electron transport theory. NEMO was combined with a parallelized genetic algorithm package (PGAPACK) to evolve structural and material parameters to match a desired set of experimental data. A numerical experiment that evolves structural variations such as layer widths and doping concentrations is performed to analyze an experimental current-voltage characteristic. The genetic algorithm is found to drive the NEMO simulation parameters close to the experimentally prescribed layer thicknesses and doping profiles. With such a quantitative agreement between theory and experiment, design synthesis can be performed.
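
    A generic genetic algorithm of the kind described (truncation selection, one-point crossover, Gaussian mutation) can be sketched as follows. The target layer thicknesses and the fitness function are invented for illustration and do not come from the paper; in the real setup, fitness would come from comparing a NEMO-simulated current-voltage curve with the measured one:

```python
import random

# Hypothetical target layer thicknesses (nm) for a three-layer structure.
TARGET = [4.5, 2.0, 4.5]

def fitness(genome):
    # Stand-in for the NEMO-vs-experiment comparison: negative squared error.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def evolve(pop_size=60, generations=100, seed=7):
    rng = random.Random(seed)
    pop = [[rng.uniform(1.0, 8.0) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(TARGET))  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(child))        # Gaussian point mutation
            child[i] += rng.gauss(0.0, 0.2)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Because every fitness evaluation is an independent simulation, this loop parallelizes naturally, which is what PGAPACK provided in the work described.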

  12. Electricity and Empire in 1920s Palestine under British Rule.

    PubMed

    Shamir, Ronen

    2016-12-01

    This article examines some techno-political aspects of the early years of electrification in British-ruled 1920s Palestine. It emphasizes the importance of local technical, topographical and hydrological forms of knowledge for understanding the dynamics of electrification. Situating the analysis in a general colonial context of electrification, the study shows that British colonial rulers lagged behind both German firms and local entrepreneurs in understanding the specific conditions pertaining to electrification in Palestine. Subsequently, the study shows that the British had limited control of the actual electrification process and its declared developmental purposes, thereby complicating assumptions about electrification as a tool of empire. Finding some similarities between the cases of electrifying Palestine and India, the article's findings may shed further light on the importance of the micro-politics of knowledge for understanding the trajectory of electrification in the colonies.

  13. Climate Change Risk Management Consulting: The opportunity for an independent business practice

    NASA Astrophysics Data System (ADS)

    Ciccozzi, R.

    2009-04-01

    The Paper outlines the main questions to be addressed with reference to the actual demand for climate change risk management consulting in the financial services. Moreover, the Project shall also try to investigate whether the Catastrophe Modelling Industry can start and manage a business practice specialised in climate change risk exposures. In this context, the Paper aims at testing the possibility to build a sound business case, based upon typical MBA course analysis tools, such as PEST(LE), SWOT, etc. Specific references to the tools to be used and to other contributions from academic literature and general documentation are also discussed in the body of the Paper and listed at the end. The analysis shall also focus on the core competencies required for an independent climate change risk management consulting business practice, with the purpose of outlining a valid definition of how to achieve competitive advantage in climate change risk management consulting.

  14. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

    We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. Of note, LipidMiner is currently limited to singly charged ions, although this is adequate for the purpose of lipidomics since lipids are rarely multiply charged [14], even for the polyphosphoinositides. LipidMiner also only processes the .RAW file format generated by Thermo mass spectrometers. In the future, we are planning to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.

  15. KISS for STRAP: user extensions for a protein alignment editor.

    PubMed

    Gille, Christoph; Lorenzen, Stephan; Michalsky, Elke; Frömmel, Cornelius

    2003-12-12

    The Structural Alignment Program STRAP is a comfortable, comprehensive editor and analysis tool for protein alignments. A wide range of functions related to protein sequences and protein structures are accessible through an intuitive graphical interface. Recent features include mapping of mutations and polymorphisms onto structures and production of high-quality figures for publication. Here we address the general problem of multi-purpose program packages keeping up with the rapid development of bioinformatical methods and the demand for specific program functions. STRAP was remade implementing a novel design which aims at Keeping Interfaces in STRAP Simple (KISS). KISS renders STRAP extendable by bio-scientists as well as bio-informaticians. Scientists with basic computer skills are capable of implementing statistical methods or embedding existing bioinformatical tools in STRAP themselves. For bio-informaticians, STRAP may serve as an environment for rapid prototyping and testing of complex algorithms such as automatic alignment algorithms or phylogenetic methods. Further, STRAP can be applied as an interactive web applet to present data related to a particular protein family, and as a teaching tool. Requires Java 1.4 or higher. Available at http://www.charite.de/bioinf/strap/

  16. A literature search tool for intelligent extraction of disease-associated genes.

    PubMed

    Jung, Jae-Yoon; DeLuca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2014-01-01

    Our aim was to extract disorder-associated genes from the scientific literature in PubMed with greater sensitivity for literature-based support than existing methods. We developed a PubMed query to retrieve disorder-related, original research articles. Then we applied a rule-based text-mining algorithm with keyword matching to extract target disorders, genes with significant results, and the type of study described by the article. We compared our resulting candidate disorder genes and supporting references with existing databases. We demonstrated that our candidate gene set covers nearly all genes in manually curated databases, and that the references supporting the disorder-gene link are more extensive and accurate than those of other general-purpose gene-to-disorder association databases. We implemented a novel publication search tool to find target articles, specifically focused on links between disorders and genotypes. Through comparison against gold-standard, manually updated gene-disorder databases and against automated databases of similar functionality, we show that our tool can search through the entirety of PubMed to extract the main gene findings for human diseases rapidly and accurately.
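
    Rule-based text mining with keyword matching, as described above, can be illustrated with a toy version. The abstracts, gene symbols, disorder keywords, and significance rules below are hypothetical stand-ins for the authors' curated rules, not their actual implementation:

```python
import re

# Toy inputs; everything here is invented for illustration.
ABSTRACTS = [
    "We found that variants in SHANK3 were significantly associated with autism.",
    "BRCA1 mutation carriers showed elevated breast cancer risk (p < 0.001).",
    "No significant association between APOE and autism was observed.",
]
GENES = {"SHANK3", "BRCA1", "APOE"}
DISORDERS = {"autism", "breast cancer"}
POSITIVE = re.compile(r"significantly associated|mutation carriers|p\s*<\s*0\.0\d+")
NEGATIVE = re.compile(r"\bno significant\b", re.IGNORECASE)

def extract(text):
    # Keep a gene-disorder pair only from sentences that signal a significant
    # result and do not explicitly negate it.
    pairs = set()
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if NEGATIVE.search(sentence) or not POSITIVE.search(sentence):
            continue
        genes = {g for g in GENES if re.search(rf"\b{g}\b", sentence)}
        disorders = {d for d in DISORDERS if d in sentence.lower()}
        pairs.update((g, d) for g in genes for d in disorders)
    return pairs

found = set()
for abstract in ABSTRACTS:
    found |= extract(abstract)
```

    The negation rule is what distinguishes this style of extraction from plain co-occurrence counting: the APOE-autism pair is mentioned but rejected.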

  17. Organizing Community-Based Data Standards: Lessons from Developing a Successful Open Standard in Systems Biology

    NASA Astrophysics Data System (ADS)

    Hucka, M.

    2015-09-01

    In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach has inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.

  18. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools reviewed here is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  19. Transputer parallel processing at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1989-01-01

    The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.

  20. Bringing the Virtual Astronomical Observatory to the Education Community

    NASA Astrophysics Data System (ADS)

    Lawton, B.; Eisenhamer, B.; Mattson, B. J.; Raddick, M. J.

    2012-08-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. The Education and Public Outreach (EPO) program for the VAO will be led by the Space Telescope Science Institute in collaboration with the High Energy Astrophysics Science Archive Research Center (HEASARC) EPO program and Johns Hopkins University. VAO EPO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public and education community. Our EPO efforts will be structured to provide uniform access to VAO information, enabling educational and research opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that the VO has already built many tools for EPO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. However, it is not enough to simply provide tools. Tools must meet the needs of the education community and address national education standards in order to be broadly utilized. To determine which tools the VAO will incorporate into the EPO program, needs assessments will be conducted with educators across the U.S.

  1. Training generalized improvisation of tools by preschool children

    PubMed Central

    Parsonson, Barry S.; Baer, Donald M.

    1978-01-01

    The development of new, “creative” behaviors was examined in a problem-solving context. One form of problem solving, improvisation, was defined as finding a substitute to replace the specifically designated, but currently unavailable, tool ordinarily used to solve the problem. The study examined whether preschool children spontaneously displayed generalized improvisation skills, and if not, whether they could be trained to do so within different classes of tools. Generalization across different tool classes was monitored but not specifically trained. Five preschool children participated in individual sessions that first probed their skill at improvising tools, and later trained and probed generalized improvisation in one or more of three tool classes (Hammers, Containers, and Shoelaces), using a multiple-baseline design. All five children were trained with Hammers, two were trained in two classes, and two were trained in all three tool classes. Four of the five children improvised little in Baseline. During Training, all five showed increased generalized improvisation within the trained class, but none across classes. Tools fabricated by item combinations were rare in Baseline, but common in Training. Followup probes showed that the training effects were durable. PMID:16795596

  2. Moving toward a standard for spinal fusion outcomes assessment.

    PubMed

    Blount, Kevin J; Krompinger, W Jay; Maljanian, Rose; Browner, Bruce D

    2002-02-01

    Previous spinal fusion outcomes assessment studies have been complicated by inconsistencies in evaluative criteria and consequent variations in results. As a result, a general consensus is lacking on how to achieve comprehensive outcomes assessment for spinal fusion surgeries. The purpose of this article is to report the most validated and frequently used assessment measures to facilitate comparable outcomes studies in the future. Twenty-seven spinal fusion outcomes studies published between 1990 and 2000 were retrospectively reviewed. Study characteristics such as design, evaluative measures, and assessment tools were recorded and analyzed. Based on the reviewed literature, an outcomes assessment model is proposed including the Short Form-36 Health Survey, the Oswestry Disability Questionnaire, the North American Spine Society Patient Satisfaction Index, the Prolo Economic Scale, a 0-10 analog pain scale, medication use, radiographically assessed fusion status, and a generalized complication rate.

  3. Peer-supported review of teaching: an evaluation.

    PubMed

    Thampy, Harish; Bourke, Michael; Naran, Prasheena

    2015-09-01

    Peer-supported review (also called peer observation) of teaching is a commonly implemented method of ascertaining teaching quality that supplements student feedback. A large variety of scheme formats with rather differing purposes are described in the literature. They range from purely formative, developmental formats that facilitate a tutor's reflection of their own teaching to reaffirm strengths and identify potential areas for development through to faculty- or institution-driven summative quality assurance-based schemes. Much of the current literature in this field focuses within general higher education and on the development of rating scales, checklists or observation tools to help guide the process. This study reports findings from a qualitative evaluation of a purely formative peer-supported review of teaching scheme that was implemented for general practice clinical tutors at our medical school and describes tutors' attitudes and perceived benefits and challenges when undergoing observation.

  4. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) or dedicated ones (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges that include simulation of clinical studies and dosimetry applications.

  5. Modified Ride-on Toy Cars for Early Power Mobility: A Technical Report

    PubMed Central

    Huang, Hsiang-han; Galloway, James C

    2012-01-01

    Background and Purpose Children with significantly decreased mobility have limited opportunities to explore their physical and social environment. A variety of assistive technologies are available to increase mobility, however no single device provides the level of functional mobility that typically developing children enjoy. The purpose of this technical report is to formally introduce a new power mobility option - the modified ride-on toy car. Key Points This report will provide a) an overview of toy car features, b) examples of basic electrical and mechanical modifications and c) a brief clinical case. Clinical Implications With creative use and customized modifications, toy cars function as a “general learning environment” for use in the clinic, home and school. As such, we anticipate that these cars will become a multi-use clinical tool to address not only mobility goals but also goals involving body function and structure such as posture and movement impairments. PMID:22466382

  6. Introduction: From "The Popularization of Science through Film" to "The Public Understanding of Science".

    PubMed

    Vidal, Fernando

    2018-03-01

    Science in film, and usual equivalents such as science on film or science on screen, refer to the cinematographic representation, staging, and enactment of actors, information, and processes involved in any aspect or dimension of science and its history. Of course, boundaries are blurry, and films shot as research tools or documentation also display science on screen. Nonetheless, they generally count as scientific film, and science in and on film or screen tend to designate productions whose purpose is entertainment and education. Moreover, these two purposes are often combined, and inherently concern empirical, methodological, and conceptual challenges associated with popularization, science communication, and the public understanding of science. It is in these areas that the notion of the deficit model emerged to designate a point of view and a mode of understanding, as well as a set of practical and theoretical problems about the relationship between science and the public.

  7. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions are still threatened with frequent floods and water resource shortage problems in China. Consequently, the task of reproducing and predicting the hydrological process in watersheds is hard and unavoidable for reducing the risks of damage and loss. Thus, it is necessary to develop an efficient and cost-effective hydrological tool in China, since many areas need to be modeled. Currently, developed hydrological tools such as Mike SHE and ArcSWAT (soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability both in land cover and in soil type. However, adopting developed commercial tools in such a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out simulation, thus lowering the efficiency of the modeling process. Besides, commercial hydrological models usually cannot be modified or improved to be suitable for some special hydrological conditions in China. Some other hydrological models are open source, but integrated into commercial GIS systems. Therefore, by integrating hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on open-source MapWindow GIS, the purpose of which is to establish the first open-source GIS-based distributed hydrological model tool in China by integrating modules of preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with an easy-to-use MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool.
The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of totally open-source GIS software, MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules including a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module contains parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and model results spatial visualization using tabular form and spatial grids. MWEasyDHM makes it possible for efficient modeling and calibration of EasyDHM, and promises further development of cost-effective applications in various watersheds.
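
    A core piece of any DEM-based hydrological analysis submodule like the one described is flow routing and accumulation. The following is a simplified D8-style sketch on a toy 3x3 elevation grid (illustrative only, not MWEasyDHM code):

```python
# Each cell drains to its steepest-descent neighbor; accumulation counts the
# cell itself plus all cells upstream of it.
DEM = [
    [9, 8, 7],
    [8, 6, 5],
    [7, 5, 3],
]
ROWS, COLS = len(DEM), len(DEM[0])

def downstream(r, c):
    # Return the lowest strictly-lower neighbor, or None at a pit/outlet.
    best, best_z = None, DEM[r][c]
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr or dc) and 0 <= rr < ROWS and 0 <= cc < COLS and DEM[rr][cc] < best_z:
                best, best_z = (rr, cc), DEM[rr][cc]
    return best

def accumulation():
    acc = [[1] * COLS for _ in range(ROWS)]
    # Visit cells from highest to lowest so every upstream total is final
    # before it is passed downstream.
    order = sorted(((DEM[r][c], r, c) for r in range(ROWS) for c in range(COLS)), reverse=True)
    for _, r, c in order:
        d = downstream(r, c)
        if d is not None:
            acc[d[0]][d[1]] += acc[r][c]
    return acc

acc = accumulation()  # the outlet cell (2, 2) receives all 9 cells
```

    Real tools add pit filling, boundary handling, and channel extraction on top of this basic pass, but the routing idea is the same.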

  8. An evaluation of a handheld spectroradiometer for the near real-time measurement of cyanobacteria for bloom management purposes.

    PubMed

    Bowling, Lee C; Shaikh, Mustak; Brayan, John; Malthus, Tim

    2017-09-09

    A commercially available handheld spectroradiometer, the WISP-3, was assessed as a tool for monitoring freshwater cyanobacterial blooms for management purposes. Three small eutrophic urban ponds, which displayed considerable within-pond and between-pond variability in water quality and cyanobacterial community composition, were used as trial sites. On-board algorithms provide field measurements of phycocyanin (CPC) and chlorophyll-a (Chl-a) from surface reflectance spectra measured by the instrument; these were compared with laboratory measurements. Although significant but weak relationships were found between WISP-3 CPC measurements and cyanobacterial biovolume, and between WISP-3 and laboratory Chl-a measurements, there was considerable scatter in the data, likely due to error in both the WISP-3 and the laboratory measurements. The relationships generally differed only slightly between ponds, indicating that different cyanobacterial communities had little effect on the pigment retrievals of the WISP-3. The on-board algorithms need appropriate modification for local conditions, which poses a problem if the instrument is to be used extensively across water bodies with differing optical properties. Although it suffers a range of other limitations, the WISP-3 has potential as a rapid screening tool for preliminary risk assessment of cyanobacterial blooms. However, such field assessment would still require adequate support from sampling and laboratory-based analysis.

  9. Balancing stability and flexibility in adaptive governance: an ...

    EPA Pesticide Factsheets

    Adaptive governance must work “on the ground,” that is, it must operate through structures and procedures that the people it governs perceive to be legitimate and fair, as well as incorporating processes and substantive goals that are effective in allowing social-ecological systems (SESs) to adapt to climate change and other impacts. To address the continuing and accelerating alterations that climate change is bringing to SESs, adaptive governance generally will require more flexibility than prior governance institutions have often allowed. However, to function as good governance, adaptive governance must pay real attention to the problem of how to balance this increased need for flexibility with continuing governance stability so that it can foster adaptation to change without being perceived or experienced as perpetually destabilizing, disruptive, and unfair. Flexibility and stability serve different purposes in governance, and a variety of tools exist to strike different balances between them while still preserving the governance institution’s legitimacy among the people governed. After reviewing those purposes and the implications of climate change for environmental governance, we examine psychological insights into the structuring of adaptive governance and the variety of legal tools available to incorporate those insights into adaptive governance regimes. Because the substantive goals of governance systems will differ among specific systems, we do no

  10. Development and validation of the positive affect and well-being scale for the neurology quality of life (Neuro-QOL) measurement system.

    PubMed

    Salsman, John M; Victorson, David; Choi, Seung W; Peterman, Amy H; Heinemann, Allen W; Nowinski, Cindy; Cella, David

    2013-11-01

    To develop and validate an item-response theory-based patient-reported outcomes assessment tool of positive affect and well-being (PAW). This is part of a larger NINDS-funded study to develop a health-related quality of life measurement system across major neurological disorders, called Neuro-QOL. Informed by a literature review and qualitative input from clinicians and patients, item pools were created to assess PAW concepts. Items were administered to a general population sample (N = 513) and a group of individuals with a variety of neurologic conditions (N = 581) for calibration and validation purposes, respectively. A 23-item calibrated bank and a 9-item short form of PAW was developed, reflecting components of positive affect, life satisfaction, or an overall sense of purpose and meaning. The Neuro-QOL PAW measure demonstrated sufficient unidimensionality and displayed good internal consistency, test-retest reliability, model fit, convergent and discriminant validity, and responsiveness. The Neuro-QOL PAW measure was designed to aid clinicians and researchers to better evaluate and understand the potential role of positive health processes for individuals with chronic neurological conditions. Further psychometric testing within and between neurological conditions, as well as testing in non-neurologic chronic diseases, will help evaluate the generalizability of this new tool.

  11. A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hodgson, M.; Li, W.

    2016-12-01

    Light detection and ranging (LiDAR) technologies have proven efficiency to quickly obtain very detailed Earth surface data for a large spatial extent. Such data is important for scientific discoveries such as Earth and ecological sciences and natural disasters and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to data intensity and computational intensity. Previous studies received notable success on parallel processing of LiDAR data to these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for some specific algorithms. We developed a general-purpose scalable framework coupled with sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerable Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, this framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework is evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides valuable references on developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
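The core idea of a tile-based spatial index is that each LiDAR point can be assigned to a fixed-size tile by simple integer division of its coordinates, after which tiles become independent units of work (for example, Hadoop map tasks). The Python sketch below is a simplified illustration of that assignment step, not the paper's actual implementation; the tile size and key scheme are assumptions:

```python
from collections import defaultdict

def tile_key(x, y, tile_size=100.0):
    """Map a point to the (col, row) key of the tile containing it."""
    return (int(x // tile_size), int(y // tile_size))

def decompose(points, tile_size=100.0):
    """Group (x, y, z) LiDAR returns by tile so that each tile can be
    processed by an independent worker, e.g. a Hadoop map task."""
    tiles = defaultdict(list)
    for x, y, z in points:
        tiles[tile_key(x, y, tile_size)].append((x, y, z))
    return tiles

# Three returns: two fall in tile (1, 0), one in tile (0, 0)
points = [(10.0, 20.0, 1.2), (150.0, 30.0, 2.5), (110.0, 40.0, 0.8)]
tiles = decompose(points)
```

In a real system each tile key would address a block in HDFS, and tasks needing neighborhood context (e.g. interpolation) would read a halo of adjacent tiles, which is what motivates the two decomposition strategies described in the abstract.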

  12. Investigation with respect to content and general properties of physics 11 textbook in accordance with the 2013 secondary school physics curriculum

    NASA Astrophysics Data System (ADS)

    Kavcar, Nevzat; Özen, Ali Ihsan

    2017-02-01

    The purpose of this work is to determine physics teacher candidates' views on the content and general properties of the Physics 11 textbook prepared in accordance with the 2013 Secondary School Physics Curriculum. Twenty-four teacher candidates in the 2015-2016 school year constituted the sample of the study, in which a scanning model based on qualitative research techniques was used and document analysis was performed. The data collection tool was a set of files prepared with 51 and 28 open-ended questions covering the subject content and the general properties of the textbook, respectively. It was concluded that the textbook was sufficient in terms of discussion, investigation, daily-life context, visual elements, and traces of permanent learning, but insufficient in its design elements and in including only one project, in the Electricity and Magnetism unit. Affective-domain activities could be added to the textbook, a teacher guide book and a teaching packet for the book could be provided, and the issues highlighted and the overall quality of the textbook could be improved.

  13. Gem and mineral identification using GL Gem Raman and comparison with other portable instruments

    NASA Astrophysics Data System (ADS)

    Culka, Adam; Hyršl, Jaroslav; Jehlička, Jan

    2016-11-01

    Several, mainly silicate, minerals in their gemstone varieties were analysed with the GL Gem Raman portable system by Gemlab R&T, Vancouver, Canada, to ascertain the general performance of this relatively inexpensive tool developed specifically for gemstone identification. The Raman spectra of gemstones acquired with this system were then critically compared with data obtained using several other portable or handheld Raman instruments. The spectra acquired with the Gem Raman instrument were typically of lesser quality than those taken by the other instruments: characteristic shortcomings included a steep baseline, probably due to mineral fluorescence; much broader Raman bands, leaving closely spaced bands less well resolved; and generally greater shifts of band positions from reference values. Some gemstone groups, such as rubies, did not provide useful Raman spectra at all. Nevertheless, general identification was possible for a selection of gemstones.

  14. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, the cancer registry is of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential for evaluating and comparing a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the relevant documents and two functional software packages in this field. The evaluation tool was a checklist, and to validate the model this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method was chosen, based on the findings, as a criteria-based evaluation. The model encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation of general criteria from specific ones while preserving the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard for evaluating cancer registry software.

  15. Web-Based Learning Design Tool

    ERIC Educational Resources Information Center

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  16. Supporting Literacy in Preschool: Using a Teacher-Observation Tool to Guide Professional Development

    ERIC Educational Resources Information Center

    McNerney, Shelly; Nielsen, Diane Corcoran; Clay, Phyllis

    2006-01-01

    Teachers involved with professional-development opportunities inevitably differ in their content knowledge, access to resources, and instructional practices. The purpose of this study was to investigate how a standardized assessment observation tool, selected to gather summative information for grant-evaluation purposes about preschool teachers'…

  17. Education and Outreach with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.

    2012-01-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.

  18. Physicians’ experience adopting the electronic transfer of care communication tool: barriers and opportunities

    PubMed Central

    de Grood, Chloe; Eso, Katherine; Santana, Maria Jose

    2015-01-01

    Purpose The purpose of this study was to assess physicians’ perceptions on a newly developed electronic transfer of care (e-TOC) communication tool and identify barriers and opportunities toward its adoption. Participants and methods The study was conducted in a tertiary care teaching center as part of a randomized controlled trial assessing the efficacy of an e-TOC communication tool. The e-TOC technology was developed through iterative consultation with stakeholders. This e-TOC summary was populated by acute care physicians (AcPs) and communicated electronically to community care physicians (CcPs). The AcPs consisted of attending physicians, resident trainees, and medical students rotating through the Medical Teaching Unit. The CcPs were health care providers caring for patients discharged from hospital to the community. AcPs and CcPs completed validated surveys assessing their experience with the newly developed e-TOC tool. Free text questions were added to gather general comments from both groups of physicians. Units of analysis were individual physicians. Data from the surveys were analyzed using mixed methods. Results AcPs completed 138 linked pre- and post-rotation surveys. At post-rotation, each AcP completed an average of six e-TOC summaries, taking an average of 37 minutes per e-TOC summary. Over 100 CcPs assessed the quality of the TOC summaries, with an overall rating of 8.3 (standard deviation: 1.48; on a scale of 1–10). Thematic analyses revealed barriers and opportunities encountered by physicians toward the adoption of the e-TOC tool. While the AcPs highlighted issues with timeliness, usability, and presentation, the CcPs identified barriers accessing the web-based TOC summaries, emphasizing that the summaries were timely and the quality of information supported continuity of care. Conclusion Despite the barriers identified by both groups of physicians, the e-TOC communication tool was well received. 
Our experience can serve as a template for other health research teams considering the implementation of e-health technologies into health care systems. PMID:25609977

  19. Scientists' sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support.

    PubMed

    Mirel, Barbara; Görg, Carsten

    2014-04-26

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists' analytical workflows and their implications for tool design.

  20. Scientists’ sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support

    PubMed Central

    2014-01-01

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists’ analytical workflows and their implications for tool design. PMID:24766796

  1. Cross-cultural acceptability and utility of the strengths and difficulties questionnaire: views of families.

    PubMed

    Kersten, Paula; Dudley, Margaret; Nayar, Shoba; Elder, Hinemoa; Robertson, Heather; Tauroa, Robyn; McPherson, Kathryn M

    2016-10-12

    Screening children for behavioural difficulties requires the use of a tool that is culturally valid. We explored the cross-cultural acceptability and utility of the Strengths and Difficulties Questionnaire for pre-school children (aged 3-5) as perceived by families in New Zealand. A qualitative interpretive descriptive study (focus groups and interviews) in which 65 participants from five key ethnic groups (New Zealand European, Māori, Pacific, Asian and other immigrant parents) took part. Thematic analysis using an inductive approach, in which the themes identified are strongly linked to the data, was employed. Many parents reported they were unclear about the purpose of the tool, affecting its perceived value. Participants reported not understanding the context in which they should consider the questions and had difficulty understanding some questions and response options. Māori parents generally did not support the questionnaire based approach, preferring face to face interaction. Parents from Māori, Pacific Island, Asian, and new immigrant groups reported the tool lacked explicit consideration of children in their cultural context. Parents discussed the importance of timing and multiple perspectives when interpreting scores from the tool. In summary, this study posed a number of challenges to the use of the Strengths and Difficulties Questionnaire in New Zealand. Further work is required to develop a tool that is culturally appropriate with good content validity.

  2. Validity and reliability of the Turkish version of the DSM-5 Generalized Anxiety Disorder Severity Scale for children aged 11–17 years

    PubMed

    Yalın Sapmaz, Şermin; Özek Erkuran, Handan; Ergin, Dilek; Öztürk, Masum; Şen Celasin, Nesrin; Karaarslan, Duygu; Aydemir, Ömer

    2018-02-23

    Background/aim: This study aimed to assess the validity and reliability of the Turkish version of the DSM-5 Generalized Anxiety Disorder Severity Scale - Child Form. Materials and methods: The study sample consisted of 32 patients treated in a child psychiatry unit and diagnosed with generalized anxiety disorder, and 98 healthy volunteers attending middle or high school during the study period. For the assessment, the Screen for Child Anxiety and Related Emotional Disorders (SCARED) was used along with the DSM-5 Generalized Anxiety Disorder Severity Scale - Child Form. Results: In the reliability analyses, the Cronbach alpha internal consistency coefficient was calculated as 0.932 and the test-retest correlation coefficient as r = 0.707. As for construct validity, a single factor explaining 62.6% of the variance was obtained, consistent with the original construct of the scale. Regarding concurrent validity, the scale showed a high correlation with the SCARED. Conclusion: The Turkish version of the DSM-5 Generalized Anxiety Disorder Severity Scale - Child Form can be utilized as a valid and reliable tool both in clinical practice and for research purposes.
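The Cronbach alpha internal consistency coefficient reported in this record (0.932) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal Python sketch of that formula is shown below; the scores are invented toy data for illustration, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from per-item score lists.

    items: one inner list per item, all over the same respondents.
    """
    k = len(items)                       # number of items
    n = len(items[0])                    # number of respondents

    def var(xs):                         # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score of each respondent across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items answered by four respondents (toy data)
scores = [[2, 3, 4, 4], [1, 3, 4, 5], [2, 2, 5, 4]]
print(round(cronbach_alpha(scores), 3))  # → 0.914
```

High alpha values such as the 0.932 reported here arise when respondents' item scores rise and fall together, inflating the total-score variance relative to the summed item variances.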

  3. SentiHealth-Cancer: A sentiment analysis tool to help detecting mood of patients in online social networks.

    PubMed

    Rodrigues, Ramon Gouveia; das Dores, Rafael Marques; Camilo-Junior, Celso G; Rosa, Thierson Couto

    2016-01-01

    Cancer is a critical disease that affects millions of people and families around the world; in 2012, about 14.1 million new cases of cancer occurred globally. For many reasons, such as the severity of some cases, the side effects of some treatments, and the deaths of fellow patients, cancer patients tend to be affected by serious emotional disorders such as depression. Monitoring patients' mood is thus an important part of their treatment. Many cancer patients are users of online social networks, and many take part in cancer virtual communities where they exchange messages commenting on their treatment or giving support to other patients in the community. Most of these communities are publicly accessible and are therefore useful sources of information about patients' mood. Sentiment Analysis methods can accordingly be useful for automatically detecting the positive or negative mood of cancer patients by analyzing their messages in these online communities. The objective of this work is to present a Sentiment Analysis tool, named SentiHealth-Cancer (SHC-pt), that improves the detection of the emotional state of patients in Brazilian online cancer communities by inspecting their posts written in Portuguese. SHC-pt is a sentiment analysis tool tailored specifically to detect positive, negative, or neutral messages of patients in online communities of cancer patients. We conducted a comparative study of the proposed method against a set of general-purpose sentiment analysis tools adapted to this context. Different collections of posts were obtained from two cancer communities on Facebook. The posts were analyzed by sentiment analysis tools that support the Portuguese language (Semantria and SentiStrength) and by SHC-pt, developed on the basis of the method proposed in this paper, called SentiHealth.
Additionally, as a second way to analyze the Portuguese texts, the collected posts were automatically translated into English and submitted to sentiment analysis tools that do not support Portuguese (AlchemyAPI and Textalytics), as well as to Semantria and SentiStrength using the English option of these tools. Six experiments were conducted with some variations and different origins of the collected posts. The results were measured using the following metrics: precision, recall, F1-measure, and accuracy. The proposed tool SHC-pt reached the best averages for accuracy and F1-measure (the harmonic mean of recall and precision) in the three sentiment classes addressed (positive, negative, and neutral) in all experimental settings. Moreover, the worst accuracy value (58%) achieved by SHC-pt in any experiment is 11.53% better than the greatest accuracy (52%) presented by the other tools, and the worst average F1 (48.46%) reached by SHC-pt in any experiment is 4.14% better than the greatest average F1 (46.53%) achieved by the other tools. Thus, even when SHC-pt results in a complex scenario are compared against other tools in an easier scenario, SHC-pt performs better. This paper makes two contributions. First, it proposes the SentiHealth method to detect the mood of cancer patients who are also users of patient communities in online social networks. Second, it presents a tool instantiated from the method, SentiHealth-Cancer (SHC-pt), dedicated to automatically analyzing posts in communities of cancer patients based on SentiHealth. This context-tailored tool outperformed other general-purpose sentiment analysis tools, at least in the cancer context, suggesting that the SentiHealth method could be instantiated as other disease-based tools in future work, for instance SentiHealth-HIV, SentiHealth-Stroke, and SentiHealth-Sclerosis. Copyright © 2015. Published by Elsevier Ireland Ltd.
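The evaluation metrics named in this record (precision, recall, F1-measure, accuracy) are the standard definitions over a per-class confusion count. A minimal Python sketch of their computation is given below; the counts are invented for illustration and have no connection to the SHC-pt experiments:

```python
def prf_accuracy(tp, fp, fn, tn):
    """Precision, recall, F1, and accuracy from one class's confusion counts:
    tp = true positives, fp = false positives, fn = false negatives,
    tn = true negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy

# e.g. 40 positives correctly flagged, 10 false alarms, 20 missed, 30 true negatives
p, r, f1, acc = prf_accuracy(40, 10, 20, 30)
print(round(p, 2), round(r, 2), round(f1, 2), round(acc, 2))  # → 0.8 0.67 0.73 0.7
```

For a three-class task such as positive/negative/neutral, these per-class figures are typically averaged, which is what the "average F1" values reported in the abstract refer to.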

  4. The PHQ-PD as a Screening Tool for Panic Disorder in the Primary Care Setting in Spain

    PubMed Central

    Wood, Cristina Mae; Ruíz-Rodríguez, Paloma; Tomás-Tomás, Patricia; Gracia-Gracia, Irene; Dongil-Collado, Esperanza; Iruarrizaga, M. Iciar

    2016-01-01

    Introduction Panic disorder is a common anxiety disorder and is highly prevalent in Spanish primary care centres. The use of validated tools can improve the detection of panic disorder in primary care populations, thus enabling referral for specialized treatment. The aim of this study is to determine the accuracy of the Patient Health Questionnaire-Panic Disorder (PHQ-PD) as a screening and diagnostic tool for panic disorder in Spanish primary care centres. Method We compared the psychometric properties of the PHQ-PD to the reference standard, the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I) interview. General practitioners referred 178 patients who completed the entire PHQ test, including the PHQ-PD, to undergo the SCID-I. The sensitivity, specificity, positive and negative predictive values and positive and negative likelihood ratios of the PHQ-PD were assessed. Results The operating characteristics of the PHQ-PD are moderate. The best cut-off score was 5 (sensitivity .77, specificity .72). Modifications to the questionnaire's algorithms improved test characteristics (sensitivity .77, specificity .72) compared to the original algorithm. The screening question alone yielded the highest sensitivity score (.83). Conclusion Although the modified algorithm of the PHQ-PD only yielded moderate results as a diagnostic test for panic disorder, it was better than the original. Using only the first question of the PHQ-PD showed the best psychometric properties (sensitivity). Based on these findings, we suggest the use of the screening questions for screening purposes and the modified algorithm for diagnostic purposes. PMID:27525977
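The sensitivity and specificity reported above are defined against the SCID-I reference standard: sensitivity is the fraction of true panic-disorder cases scoring at or above the cut-off, and specificity is the fraction of non-cases scoring below it. A minimal Python sketch, using made-up scores and the cut-off of 5 reported in the abstract (the scoring convention "score >= cutoff is a positive screen" is an assumption for illustration):

```python
def sens_spec(scores_with_diagnosis, cutoff):
    """scores_with_diagnosis: (questionnaire_score, has_disorder) pairs,
    where has_disorder comes from the reference-standard interview.
    A score >= cutoff counts as a positive screen."""
    tp = sum(1 for s, d in scores_with_diagnosis if d and s >= cutoff)
    fn = sum(1 for s, d in scores_with_diagnosis if d and s < cutoff)
    tn = sum(1 for s, d in scores_with_diagnosis if not d and s < cutoff)
    fp = sum(1 for s, d in scores_with_diagnosis if not d and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Toy sample: 4 patients with panic disorder, 4 without
sample = [(7, True), (6, True), (4, True), (8, True),
          (2, False), (5, False), (1, False), (3, False)]
sensitivity, specificity = sens_spec(sample, cutoff=5)
```

Sweeping `cutoff` over the score range and recording both values at each step is how an optimal cut-off such as the 5 reported here is usually chosen.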

  5. Can social support work virtually? Evaluation of rheumatoid arthritis patients' experiences with an interactive online tool.

    PubMed

    Kostova, Zlatina; Caiata-Zufferey, Maria; Schulz, Peter J

    2015-01-01

    There is strong empirical evidence that the support that chronic patients receive from their environment is fundamental for the way they cope with physical and psychological suffering. Nevertheless, in the case of rheumatoid arthritis (RA), providing the appropriate social support is still a challenge, and such support has often proven to be elusive and unreliable in helping patients to manage the disease. To explore whether and how social support for RA patients can be provided online, and to assess the conditions under which such support is effective. An online support tool was designed to provide patients with both tailored information and opportunities to interact online with health professionals and fellow sufferers. The general purpose was to identify where the support provided did - or did not - help patients, and to judge whether the determinants of success lay more within patients - their engagement and willingness to participate - or within the design of the website itself. The present study reports qualitative interviews with 19 users of the tool. A more specific purpose was to elaborate qualitatively on results from a quantitative survey of users, which indicated that any positive impact was confined to practical matters of pain management rather than extending to more fundamental psychological outcomes such as acceptance. Overall, online learning and interaction can do much to help patients with the everyday stresses of their disease; however, its potential for more durable positive impact depends on various individual characteristics such as personality traits, existing social networks, and the severity and longevity of the disease.

  6. Health Technology Assessment, International Reference Pricing, and Budget Control Tools from China's Perspective: What Are the Current Developments and Future Considerations?

    PubMed

    Koh, Liling; Glaetzer, Christoph; Chuen Li, Shu; Zhang, Meng

    2016-05-01

    China is investing considerably in health care reforms to address issues in its health care system. An example is access to innovative drugs, which remains challenging because it is largely dependent on patient self-pay. Recognizing this, the government has invested considerably in its basic medical insurance. As health care expenditure increases, there are growing concerns on budget control. Several health policy tools have been discussed recently such as health technology assessment, international reference pricing, and hospital budget control tools, which can be viewed as addressing the affordability concerns of the government budget. China has also listed her health outcomes goals in "Healthy China 2020" initiative. This article aimed to discuss the "fit-for-purpose" of these tools to address budget concerns and support China in reaching her health outcomes goals. The findings are informed by a panel discussion at ISPOR Asia Pacific 2014, literature review, and authors' experience. This review looks at the current developments in China and the considerations and implications for using these tools by drawing experiences from countries where they are used. These tools are generally used in countries with advanced health care systems. China's health care spending is still below that of countries with advanced health care systems and below World Health Organization recommendation. China has not yet reached the "critical mass" necessary for the effective use of these tools. As China continues its health care reforms, increase in health care spending to balance the health needs of the population would be key. Copyright © 2015. Published by Elsevier Inc.

  7. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads, and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run under both Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the GNU General Public License version 2 as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  8. BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.

    PubMed

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-06-01

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.

  9. Using NERSC High-Performance Computing (HPC) systems for high-energy nuclear physics applications with ALICE

    NASA Astrophysics Data System (ADS)

    Fasel, Markus

    2016-10-01

    High-Performance Computing systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole-node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems Carver, Hopper, and Edison, and is being configured to provide access to the next-generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion includes our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.

  10. Agile Task Tracking Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, Roger T.; Crump, Thomas Vu

    The work was created to provide a tool for improving the management of tasks associated with Agile projects. Agile projects are typically completed in an iterative manner, with many short-duration tasks performed as part of iterations. These iterations are generally referred to as sprints. The objective of this work is to create a single tool that enables sprint teams to manage all of their tasks in multiple sprints and automatically produce all standard sprint performance charts with minimum effort. The format of the printed work is designed to mimic a standard Kanban board. The work is developed as a single Excel file with worksheets capable of managing up to five concurrent sprints and up to one hundred tasks. It also includes a summary worksheet providing performance information from all active sprints. There are many commercial project management systems, typically designed with features desired by larger organizations with many resources managing multiple programs and projects. The audience for this work is the small organizations and Agile project teams desiring an inexpensive, simple, user-friendly task management tool. This work uses standard, readily available software (Excel), requiring minimum data entry and automatically creating summary charts and performance data. It is formatted to print out and resemble standard flip charts and provide the visuals associated with this type of work.

  11. KINECTATION (Kinect for Presentation): Control Presentation with Interactive Board and Record Presentation with Live Capture Tools

    NASA Astrophysics Data System (ADS)

    Sutoyo, Rhio; Herriyandi; Fennia Lesmana, Tri; Susanto, Edy

    2017-01-01

    Presentation is one of the most common activities performed in various fields of work (e.g., by lecturers, employees, and managers). The purpose of a presentation is to demonstrate or introduce the presenter's ideas to the attendees. Within the given time and specific place, presenters must transfer their knowledge and leave a great impression on their audience. Generally, presenters use several handy tools such as a mouse, a presenter remote, and a webcam to help them navigate their slides. Nevertheless, some of these tools have constraints and limitations, such as a lack of portability or multimedia support. In this research, we develop an application that assists presenters in controlling their presentation materials using Microsoft KINECT. We manipulate the colour image, image depth, and skeleton of the presenters captured by the KINECT, and then show the post-processed image results on the projector screen. The KINECT is more useful than other tools because it supports video and audio recording. Moreover, it is also able to capture presenters' movements, which can be used as input to interact with and manipulate the content (i.e., by touching the projection wall). Not only does this application provide an alternative for controlling presentation activity, it also makes the presentation more efficient and attractive.

  12. 24 CFR 902.1 - Purpose and general description.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... URBAN DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM General Provisions § 902.1 Purpose and general description. (a) Purpose. The purpose of the Public Housing Assessment System (PHAS) is to improve the delivery of services in public housing and enhance trust in the public housing system among public housing...

  13. Effects of Collaborative Activities on Group Identity in Virtual World

    ERIC Educational Resources Information Center

    Park, Hyungsung; Seo, Sumin

    2013-01-01

    The purpose of this study was to analyze the effects of collaborative activities on group identity in a virtual world such as "Second Life." To achieve this purpose, this study adopted events that promoted participants' interactions using tools inherent in "Second Life." The interactive tools given to the control group in this…

  14. Educational Leadership Effectiveness: A Rasch Analysis

    ERIC Educational Resources Information Center

    Sinnema, Claire; Ludlow, Larry; Robinson, Viviane

    2016-01-01

    Purpose: The purposes of this paper are, first, to establish the psychometric properties of the ELP tool, and, second, to test, using a Rasch item response theory analysis, the hypothesized progression of challenge presented by the items included in the tool. Design/ Methodology/ Approach: Data were collected at two time points through a survey of…

  15. Assessing Competencies Needed to Engage With Digital Health Services: Development of the eHealth Literacy Assessment Toolkit.

    PubMed

    Karnoe, Astrid; Furstrand, Dorthe; Christensen, Karl Bang; Norgaard, Ole; Kayser, Lars

    2018-05-10

    To achieve full potential in user-oriented eHealth projects, we need to ensure a match between the eHealth technology and the user's eHealth literacy, described as knowledge and skills. However, there is a lack of multifaceted eHealth literacy assessment tools suitable for screening purposes. The objective of our study was to develop and validate an eHealth literacy assessment toolkit (eHLA) that assesses individuals' health literacy and digital literacy using a mix of existing and newly developed scales. From 2011 to 2015, scales were continuously tested and developed in an iterative process, which led to 7 tools being included in the validation study. The eHLA validation version consisted of 4 health-related tools (tool 1: "functional health literacy," tool 2: "health literacy self-assessment," tool 3: "familiarity with health and health care," and tool 4: "knowledge of health and disease") and 3 digitally-related tools (tool 5: "technology familiarity," tool 6: "technology confidence," and tool 7: "incentives for engaging with technology") that were tested in 475 respondents from a general population sample and an outpatient clinic. Statistical analyses examined floor and ceiling effects, interitem correlations, item-total correlations, and Cronbach coefficient alpha (CCA). Rasch models (RM) examined the fit of data. Tools were reduced in items to secure robust tools fit for screening purposes. Reductions were made based on psychometrics, face validity, and content validity. Tool 1 was not reduced in items; it consequently consists of 10 items. The overall fit to the RM was acceptable (Anderson conditional likelihood ratio, CLR=10.8; df=9; P=.29), and CCA was .67. Tool 2 was reduced from 20 to 9 items. The overall fit to a log-linear RM was acceptable (Anderson CLR=78.4, df=45, P=.002), and CCA was .85. Tool 3 was reduced from 23 to 5 items. The final version showed excellent fit to a log-linear RM (Anderson CLR=47.7, df=40, P=.19), and CCA was .90. 
Tool 4 was reduced from 12 to 6 items. The fit to a log-linear RM was acceptable (Anderson CLR=42.1, df=18, P=.001), and CCA was .59. Tool 5 was reduced from 20 to 6 items. The fit to the RM was acceptable (Anderson CLR=30.3, df=17, P=.02), and CCA was .94. Tool 6 was reduced from 5 to 4 items. The fit to a log-linear RM taking local dependency (LD) into account was acceptable (Anderson CLR=26.1, df=21, P=.20), and CCA was .91. Tool 7 was reduced from 6 to 4 items. The fit to a log-linear RM taking LD and differential item functioning into account was acceptable (Anderson CLR=23.0, df=29, P=.78), and CCA was .90. The eHLA consists of 7 short, robust scales that assess individuals' knowledge and skills related to digital literacy and health literacy. ©Astrid Karnoe, Dorthe Furstrand, Karl Bang Christensen, Ole Norgaard, Lars Kayser. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 10.05.2018.
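The Cronbach coefficient alpha (CCA) reported for each tool is a standard internal-consistency statistic that can be computed directly from item-level scores. A minimal sketch, using invented item scores rather than the study's data:

```python
# Illustrative computation of Cronbach's coefficient alpha. The item scores
# below are invented for demonstration only; they are not the eHLA data.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one score list per item; all lists cover the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total score
    item_variance = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_variance / pvariance(totals))

# Four hypothetical items answered by five respondents
items = [
    [3, 4, 3, 5, 2],
    [2, 4, 4, 5, 1],
    [3, 5, 3, 4, 2],
    [2, 3, 4, 5, 2],
]
print(round(cronbach_alpha(items), 2))  # 0.92
```

Alpha rises when items covary strongly relative to their individual variances, which is why dropping poorly correlated items during scale reduction can leave CCA intact or even improve it.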

  16. 22 CFR 309.1 - General purpose.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-tax debts owed to Peace Corps and to the United States. ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true General purpose. 309.1 Section 309.1 Foreign Relations PEACE CORPS DEBT COLLECTION General Provisions § 309.1 General purpose. This part prescribes the...

  17. 47 CFR 32.6124 - General purpose computers expense.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...

  18. 47 CFR 32.6124 - General purpose computers expense.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...

  19. 47 CFR 32.6124 - General purpose computers expense.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...

  20. 47 CFR 32.6124 - General purpose computers expense.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...

  1. 47 CFR 32.6124 - General purpose computers expense.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...

  2. Thermoplastic welding apparatus and method

    DOEpatents

    Matsen, Marc R.; Negley, Mark A.; Geren, William Preston; Miller, Robert James

    2017-03-07

    A thermoplastic welding apparatus includes a thermoplastic welding tool, at least one tooling surface in the thermoplastic welding tool, a magnetic induction coil in the thermoplastic welding tool and generally encircling the at least one tooling surface and at least one smart susceptor in the thermoplastic welding tool at the at least one tooling surface. The magnetic induction coil is adapted to generate a magnetic flux field oriented generally parallel to a plane of the at least one smart susceptor.

  3. Underground coal mine instrumentation and test

    NASA Technical Reports Server (NTRS)

    Burchill, R. F.; Waldron, W. D.

    1976-01-01

    The need to evaluate the mechanical performance of mine tools and to obtain test performance data from candidate systems dictated that an engineering data recording system be built. Because of the wide range of test parameters to be evaluated, a general purpose data gathering system was designed and assembled to permit maximum versatility. A primary objective of this program was to provide a specific operating evaluation of a longwall mining machine's vibration response under normal operating conditions. A number of mines were visited, and a candidate for test evaluation was selected based upon management cooperation, machine suitability, and mine conditions. Actual mine testing took place in a West Virginia mine.

  4. Dynamics of a linear system coupled to a chain of light nonlinear oscillators analyzed through a continuous approximation

    NASA Astrophysics Data System (ADS)

    Charlemagne, S.; Ture Savadkoohi, A.; Lamarque, C.-H.

    2018-07-01

    The continuous approximation is used in this work to describe the dynamics of a nonlinear chain of light oscillators coupled to a linear main system. A general methodology is applied to an example where the chain has local nonlinear restoring forces. The slow invariant manifold is detected at fast time scale. At slow time scale, equilibrium and singular points are sought around this manifold in order to predict periodic regimes and strongly modulated responses of the system. Analytical predictions are in good accordance with numerical results and represent a potent tool for designing nonlinear chains for passive control purposes.

  5. An intelligent advisory system for pre-launch processing

    NASA Technical Reports Server (NTRS)

    Engrand, Peter A.; Mitchell, Tami

    1991-01-01

    The shuttle system of interest in this paper is the shuttle's data processing system (DPS). The DPS is composed of the following: (1) general purpose computers (GPC); (2) a multifunction CRT display system (MCDS); (3) mass memory units (MMU); and (4) a multiplexer/demultiplexer (MDM) and related software. To ensure the correct functioning of shuttle systems, some level of automatic error detection has been incorporated into all shuttle systems. For the DPS, error detection equipment has been incorporated into all of its subsystems. The automated diagnostic system, the MCDS diagnostic tool, which aids in more efficient processing of the DPS, is described.

  6. Development and validation of a general purpose linearization program for rigid aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, E. L.; Antoniewicz, R. F.

    1985-01-01

    A FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft models is discussed. The program LINEAR numerically determines a linear systems model using nonlinear equations of motion and a user-supplied, nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model. Also included in the report is a comparison of linear and nonlinear models for a high-performance aircraft.
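The core of numerical linearization can be sketched by finite differences: perturb each state and control about an operating point and build the state matrix A and control matrix B column by column. This is a hedged illustration of the general technique, not the actual LINEAR program, and the dynamics function below is an invented toy model rather than an aircraft model.

```python
# Sketch of finite-difference linearization about an operating point (x0, u0).
import math

def linearize(f, x0, u0, eps=1e-6):
    """Return (A, B) with A[i][j] = df_i/dx_j and B[i][j] = df_i/du_j at (x0, u0)."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A = [[0.0] * n for _ in range(n)]
    B = [[0.0] * m for _ in range(n)]
    for j in range(n):                     # perturb each state in turn
        xp = list(x0)
        xp[j] += eps
        fp = f(xp, u0)
        for i in range(n):
            A[i][j] = (fp[i] - f0[i]) / eps
    for j in range(m):                     # perturb each control in turn
        up = list(u0)
        up[j] += eps
        fp = f(x0, up)
        for i in range(n):
            B[i][j] = (fp[i] - f0[i]) / eps
    return A, B

# Toy nonlinear dynamics: x1' = x2, x2' = -sin(x1) + u
f = lambda x, u: [x[1], -math.sin(x[0]) + u[0]]
A, B = linearize(f, [0.0, 0.0], [0.0])
# Near the origin, A is approximately [[0, 1], [-1, 0]] and B approximately [[0], [1]]
```

A production tool would add central differences, step-size selection, and an observation equation, but the column-by-column Jacobian construction is the same idea.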

  7. Group decision-making techniques for natural resource management applications

    USGS Publications Warehouse

    Coughlan, Beth A.K.; Armour, Carl L.

    1992-01-01

    This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study, and it is applicable to natural resource management issues.

  8. Student Self-evaluation After Nursing Examinations: That's a Wrap.

    PubMed

    Butzlaff, Alice; Gaylle, Debrayh; O'Leary Kelley, Colleen

    2018-04-13

    Examination wrappers are a self-evaluation tool that uses metacognition to help students reflect on test performance. After examinations, rather than focus on points earned, students learn to self-identify study strategies and recognize methods of test preparation. The purpose of the study was to determine if the use of an examination wrapper after each test would encourage students to self-evaluate performance and adjust study strategies. A total of 120 undergraduate nursing students completed self-evaluations after each examination, which were analyzed using content analysis. Three general patterns emerged from student self-evaluation: effective and ineffective study strategies, understanding versus memorization of content, and nurse educator assistance.

  9. Interviews with the Apollo lunar surface astronauts in support of planning for EVA systems design

    NASA Technical Reports Server (NTRS)

    Connors, Mary M.; Eppler, Dean B.; Morrow, Daniel G.

    1994-01-01

    Focused interviews were conducted with the Apollo astronauts who landed on the moon. The purpose of these interviews was to help define extravehicular activity (EVA) system requirements for future lunar and planetary missions. Information from the interviews was examined with particular attention to identifying areas of consensus, since some commonality of experience is necessary to aid in the design of advanced systems. Results are presented under the following categories: mission approach; mission structure; suits; portable life support systems; dust control; gloves; automation; information, displays, and controls; rovers and remotes; tools; operations; training; and general comments. Research recommendations are offered, along with supporting information.

  10. f(T,R) theory of gravity

    NASA Astrophysics Data System (ADS)

    Salti, Mustafa; Korunur, Murat; Acikgoz, Irfan; Pirinccioglu, Nurettin; Binbay, Figen

    We mainly focus on the idea that the dynamics of the whole universe may be understood by making use of torsion T and curvature R at the same time. The f(T,R)-gravity can be considered as a fundamental gravitational theory describing the evolution of the universe. The model can produce the unification of the general relativity (GR), teleparallel gravity (TPG), f(R)-gravity and f(T)-gravity theories. For this purpose, the corresponding Lagrangian density is written in terms of an arbitrary function of the torsion and curvature scalars. Furthermore, we use the absence/existence puzzle of relativistic neutron stars and thermodynamical laws as constraining tools for the new proposal.

  11. Genetics on the World Wide Web.

    PubMed

    Trangenstein, P A; Hetteberg, C

    1998-11-01

    Since 1990, when the Human Genome Project was initiated, the amount of genetic information on the World Wide Web (WWW) has grown substantially. The WWW has become an important resource for current, accurate, and reliable genetic information for health care professionals and the general public. The purpose of this article is to provide a variety of genetics-related WWW sites that are useful for all levels of practitioners interested in genetics. In selecting sites to be included in this article, a number of evaluation tools were reviewed. The primary concern was that these sites be reputable and provide accurate, timely information. A table of the WWW sites is included for quick, easy reference.

  12. Multiwire Gamma Camera for Radionuclide and Radiographic Imaging in the Space environment

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L.

    1985-01-01

    Unique multiwire proportional counter technology has been developed at the Johnson Space Center over the past several years. The technology is described, along with how it may apply to both near-term and long-term NASA efforts. In the near term, I feel that the technology will provide a significant tool for the cardiovascular research area; in particular, it will supply low-dose nuclear medicine and tissue densitometry techniques of expanded scope. In the longer term, the multiwire technique can provide a general purpose radiology and nuclear medicine facility for use in the space station that would be difficult and costly to provide by other means.

  13. A Module Language for Typing by Contracts

    NASA Technical Reports Server (NTRS)

    Glouche, Yann; Talpin, Jean-Pierre; LeGuernic, Paul; Gautier, Thierry

    2009-01-01

    Assume-guarantee reasoning is a popular and expressive paradigm for modular and compositional specification of programs. It is becoming a fundamental concept in some computer-aided design tools for embedded system design. In this paper, we elaborate foundations for contract-based embedded system design by proposing a general-purpose module language based on a Boolean algebra in which contracts can be defined. In this framework, contracts are used to negotiate the correctness of assumptions made about the definition of a component at the point where it is used, and to provide guarantees to its environment. We illustrate this presentation with the specification of a simplified 4-stroke engine model.
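The assume-guarantee idea can be sketched as contracts over a Boolean algebra of behaviour sets. This is a loose illustration under invented names and behaviours; the paper's actual module language is considerably richer.

```python
# Loose sketch: a contract pairs an assumption (behaviours the environment is
# assumed to exhibit) with a guarantee (behaviours the component promises).
from dataclasses import dataclass

@dataclass(frozen=True)
class Contract:
    assumption: frozenset  # environments the component tolerates
    guarantee: frozenset   # behaviours the component may produce in response

    def refines(self, other):
        """self refines other if it tolerates every environment other tolerates
        (assumption set at least as large) and allows only behaviours other
        allows (guarantee set no larger)."""
        return other.assumption <= self.assumption and self.guarantee <= other.guarantee

weak = Contract(assumption=frozenset({"a"}), guarantee=frozenset({"a", "b", "c"}))
strong = Contract(assumption=frozenset({"a", "b"}), guarantee=frozenset({"a", "b"}))
print(strong.refines(weak))  # True
print(weak.refines(strong))  # False
```

Refinement ordered this way lets a more permissive, more precise component substitute for a weaker one, which is what makes contract negotiation between a component and its environment compositional.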

  14. Risk assessment techniques with applicability in marine engineering

    NASA Astrophysics Data System (ADS)

    Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.

    2015-11-01

    Nowadays, risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since managing risk first requires analyzing and evaluating it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; i.e., risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and weigh the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, so that the probability of the top event can be evaluated. The steps of this analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the top event. These results can be used for risk assessment analyses.
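The Boolean-algebra step of FTA reduces to two gate rules when basic events are statistically independent: an AND gate multiplies failure probabilities, and an OR gate takes the complement of the product of complements. A minimal sketch with an invented fault tree (not drawn from the paper):

```python
# Hedged sketch of fault-tree evaluation. Basic events carry failure
# probabilities; gates combine them assuming independent events:
#   AND gate: all inputs fail -> product of probabilities
#   OR gate:  any input fails -> 1 - product of (1 - p)
from math import prod

def evaluate(node):
    """Recursively compute the failure probability of a gate or basic event."""
    if isinstance(node, float):              # basic event
        return node
    gate, children = node
    probs = [evaluate(child) for child in children]
    if gate == "AND":
        return prod(probs)
    if gate == "OR":
        return 1.0 - prod(1.0 - p for p in probs)
    raise ValueError(f"unknown gate: {gate}")

# Hypothetical top event: the system fails if (power loss OR controller fault)
# AND the backup also fails.
tree = ("AND", [("OR", [0.01, 0.02]), 0.1])
print(round(evaluate(tree), 6))  # 0.00298
```

Real analyses must also handle repeated basic events (via minimal cut sets) rather than evaluating the tree naively, since the independence assumption fails when the same event feeds two gates.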

  15. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although three general-purpose Monte Carlo (MC) simulation tools (Geant4, FLUKA, and PHITS) have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physical model, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple system such as a water phantom alone. Since particle beams undergo transport, interaction, and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customized parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and proton range obtained from our optimized parameter list showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical model, particle transport mechanics, and different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  16. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  17. Need a Special Tool? Make It Yourself!

    ERIC Educational Resources Information Center

    Mordini, Robert D.

    2007-01-01

    People seem to have created a tool for every purpose. If a person searches diligently, he can usually find the tool he needs. However, several things may affect this process such as time, cost of the tool, and limited tool sources. The solution to all these is to make the tool yourself. People have made tools for many thousands of years, and with…

  18. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs), for geospatial data management, visualization and analysis. A data management building block iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enable reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway, instead it supports diverse needs ranging from just a feature-rich data management system, to complex scientific tools and workflows.

  19. Is video review of patient encounters an effective tool for medical student learning? A review of the literature

    PubMed Central

    Hammoud, Maya M; Morgan, Helen K; Edwards, Mary E; Lyon, Jennifer A; White, Casey

    2012-01-01

    Purpose To determine if video review of student performance during patient encounters is an effective tool for medical student learning. Methods Multiple bibliographic databases that include medical, general health care, education, psychology, and behavioral science literature were searched for the following terms: medical students, medical education, undergraduate medical education, education, self-assessment, self-evaluation, self-appraisal, feedback, videotape, video recording, televised, and DVD. The authors examined all abstracts resulting from this search and reviewed the full text of the relevant articles as well as additional articles identified in the reference lists of the relevant articles. Studies were classified by year of student (preclinical or clinical) and study design (controlled or non-controlled). Results A total of 67 articles met the final search criteria and were fully reviewed. Most studies were non-controlled and performed in the clinical years. Although the studies were quite variable in quality, design, and outcomes, in general video recording of performance and subsequent review by students with expert feedback had positive outcomes in improving feedback and ultimate performance. Video review with self-assessment alone was not found to be generally effective, but when linked with expert feedback it was superior to traditional feedback alone. Conclusion There are many methods for integrating effective use of video-captured performance into a program of learning. We recommend combining student self-assessment with feedback from faculty or other trained individuals for maximum effectiveness. We also recommend additional research in this area. PMID:23761999

  20. 1 CFR 2.1 - Scope and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Scope and purpose. 2.1 Section 2.1 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER GENERAL GENERAL INFORMATION § 2.1 Scope and purpose. (a) This chapter sets forth the policies, procedures, and delegations under which the...

  1. 1 CFR 2.1 - Scope and purpose.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 1 General Provisions 1 2011-01-01 2011-01-01 false Scope and purpose. 2.1 Section 2.1 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER GENERAL GENERAL INFORMATION § 2.1 Scope and purpose. (a) This chapter sets forth the policies, procedures, and delegations under which the...

  2. 1 CFR 2.1 - Scope and purpose.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 1 General Provisions 1 2014-01-01 2012-01-01 true Scope and purpose. 2.1 Section 2.1 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER GENERAL GENERAL INFORMATION § 2.1 Scope and purpose. (a) This chapter sets forth the policies, procedures, and delegations under which the...

  3. 1 CFR 2.1 - Scope and purpose.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 1 General Provisions 1 2012-01-01 2012-01-01 false Scope and purpose. 2.1 Section 2.1 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER GENERAL GENERAL INFORMATION § 2.1 Scope and purpose. (a) This chapter sets forth the policies, procedures, and delegations under which the...

  4. 1 CFR 2.1 - Scope and purpose.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 1 General Provisions 1 2013-01-01 2012-01-01 true Scope and purpose. 2.1 Section 2.1 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER GENERAL GENERAL INFORMATION § 2.1 Scope and purpose. (a) This chapter sets forth the policies, procedures, and delegations under which the...

  5. Affordances of instrumentation in general chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Sherman, Kristin Mary Daniels

    The purpose of this study is to find out what students in the first chemistry course at the undergraduate level (general chemistry for science majors) know about the affordances of instrumentation used in the general chemistry laboratory and how their knowledge develops over time. Overall, students see the PASCO(TM) system as a useful and accurate measuring tool for general chemistry labs. They see the probeware as easy to use, portable, and able to interact with computers. Students find that the PASCO(TM) probeware system is useful in their general chemistry labs, more advanced chemistry labs, and in other science classes, and can be used in a variety of labs done in general chemistry. Students learn the affordances of the probeware through the lab manual, the laboratory teaching assistant, by trial and error, and from each other. The use of probeware systems provides lab instructors the opportunity to focus on the concepts illustrated by experiments and the opportunity to spend time discussing the results. In order to teach effectively, the instructor must know the correct name of the components involved, how to assemble and disassemble it correctly, how to troubleshoot the software, and must be able to replace broken or missing components quickly. The use of podcasts or Web-based videos should increase student understanding of affordances of the probeware.

  6. Weak ergodicity of population evolution processes.

    PubMed

    Inaba, H

    1989-10-01

    The weak ergodic theorems of mathematical demography state that the age distribution of a closed population is asymptotically independent of the initial distribution. In this paper, we provide a new proof of the weak ergodic theorem of the multistate population model with continuous time. The main tool to attain this purpose is a theory of multiplicative processes, which was mainly developed by Garrett Birkhoff, who showed that ergodic properties generally hold for an appropriate class of multiplicative processes. First, we construct a general theory of multiplicative processes on a Banach lattice. Next, we formulate a dynamical model of a multistate population and show that its evolution operator forms a multiplicative process on the state space of the population. Subsequently, we investigate a sufficient condition that guarantees the weak ergodicity of the multiplicative process. Finally, we prove the weak and strong ergodic theorems for the multistate population and resolve the consistency problem.
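    The content of the weak ergodic theorem is easy to demonstrate numerically. The sketch below uses a discrete-time, two-age-class Leslie matrix (an illustrative toy, not the paper's continuous-time multistate setting, and with made-up vital rates): two different initial age distributions converge to the same normalized distribution.

```python
# Weak ergodicity, numerically: iterate the Leslie matrix [[f0, f1], [s, 0]]
# from two different initial age vectors and compare the normalized results.
# Fertility and survival values are arbitrary illustrative choices.

def step(v, fertility=(1.0, 1.5), survival=0.5):
    """One projection step: newborns from both age classes, survivors age up."""
    births = fertility[0] * v[0] + fertility[1] * v[1]
    survivors = survival * v[0]
    return [births, survivors]

def normalized_trajectory(v, steps=60):
    for _ in range(steps):
        v = step(v)
    total = sum(v)
    return [x / total for x in v]

a = normalized_trajectory([1.0, 0.0])  # everyone starts in age class 0
b = normalized_trajectory([0.0, 1.0])  # everyone starts in age class 1
print(a, b)  # nearly identical despite the different initial distributions
```

    For this matrix the dominant eigenvalue is 1.5 with normalized eigenvector [0.75, 0.25], so both trajectories settle on that age distribution, independent of the starting point.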

  7. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.
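    The flavour of rule-based modelling can be conveyed with a toy deterministic simulation in plain Python. This is neither Kappa nor LBS-κ syntax, and the species and rate values are invented for illustration; each rule rewrites reactant species into products at a mass-action rate.

```python
# Toy rule-based, mass-action simulation (illustrative only; not Kappa or
# LBS-kappa). A rule is (reactants, products, rate); a deterministic Euler
# step applies every rule at rate * product of reactant amounts.

def mass_action_step(state, rules, dt):
    """One Euler step of mass-action kinetics over all rules."""
    deltas = {s: 0.0 for s in state}
    for reactants, products, rate in rules:
        flux = rate
        for r in reactants:
            flux *= state[r]
        for r in reactants:
            deltas[r] -= flux
        for p in products:
            deltas[p] += flux
    return {s: state[s] + dt * deltas[s] for s in state}

# A single binding rule A + B -> AB, the kind of contact-formation event a
# Kappa rule expresses; species names and the rate are hypothetical.
rules = [(("A", "B"), ("AB",), 0.1)]
state = {"A": 10.0, "B": 10.0, "AB": 0.0}
for _ in range(1000):
    state = mass_action_step(state, rules, 0.01)
print(state["AB"])  # most of A and B have bound by t = 10
```

    What Kappa adds over this toy is site-graph structure: agents have binding sites, and one rule can cover combinatorially many concrete complexes, which is exactly the bookkeeping LBS-κ then helps organise into modules.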

  8. A High-Level Language for Rule-Based Modelling

    PubMed Central

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D.

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  9. Crossing the line--learning psychiatry at the movies.

    PubMed

    Akram, Adil; O'Brien, Aileen; O'Neill, Aidan; Latham, Richard

    2009-06-01

    Special Study Modules (SSMs) have developed in response to the General Medical Council's recommendations. St George's, University of London runs a 'Psychiatry and Film' SSM for medical students on the 5-year MBBS course. Many films have plots or characters that have a mental illness. Psychiatry & filmmaking share certain skills. Both seek to understand character, motivation and behaviour. Cinema therefore has the potential to be a useful tool for medical educational purposes. Specific to psychiatry, themes such as the accuracy of portrayals of different mental illness, the psychiatrist/patient relationship and living with a mental illness can be explored. General issues such as the role of the psychiatrist in society, medical ethics, professionalism and stigma can also be usefully highlighted for consideration and debate. This may encourage medical students to consider psychiatry as a potential career specialty and help reduce negative attitudes to mental illness.

  10. Darwin v. 2.0: an interpreted computer language for the biosciences.

    PubMed

    Gonnet, G H; Hallett, M T; Korostensky, C; Bernardin, L

    2000-02-01

    We announce the availability of the second release of Darwin v. 2.0, an interpreted computer language especially tailored to researchers in the biosciences. The system is a general tool applicable to a wide range of problems. This second release improves Darwin version 1.6 in several ways: it now contains (1) a larger set of libraries touching most of the classical problems from computational biology (pairwise alignment, all versus all alignments, tree construction, multiple sequence alignment), (2) an expanded set of general purpose algorithms (search algorithms for discrete problems, matrix decomposition routines, complex/long integer arithmetic operations), (3) an improved language with a cleaner syntax, (4) better on-line help, and (5) a number of fixes to user-reported bugs. Darwin is made available for most operating systems free of charge from the Computational Biochemistry Research Group (CBRG), reachable at http://chrg.inf.ethz.ch or by e-mail at darwin@inf.ethz.ch.
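    Among the library areas listed, pairwise alignment is the classic example. Darwin's own syntax and scoring matrices are not reproduced in this record; the Python sketch below shows the kind of computation such a library performs, using the textbook Needleman-Wunsch global-alignment recurrence with simple illustrative scores.

```python
# Needleman-Wunsch global alignment score (dynamic programming sketch with
# toy match/mismatch/gap scores; real libraries use substitution matrices
# such as PAM or BLOSUM and also recover the alignment itself).

def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Score of the best global alignment of strings a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * gap          # align a[:i] against all gaps
    for j in range(1, cols):
        dp[0][j] = j * gap          # align b[:j] against all gaps
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            dp[i][j] = max(diag, dp[i-1][j] + gap, dp[i][j-1] + gap)
    return dp[-1][-1]

print(nw_score("GATTACA", "GATCA"))
```

    The best alignment here keeps five matches and pays for two gaps, which is the trade-off the recurrence explores at every cell.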

  11. Design of two-channel oscilloscope and basic circuit simulations in LabView

    NASA Astrophysics Data System (ADS)

    Balzhiev, Plamen; Makal, Jaroslaw

    2008-01-01

    The project was realized as a diploma thesis at Bialystok Technical University, Poland. The main aim is to develop a useful educational tool that presents the time and frequency characteristics of basic electrical circuits. It is designed as a helpful instrument for lectures and laboratory classes. The predominant audience will be students of electrical engineering in the first semesters of higher education. At this stage the students' level of knowledge is still limited, so different techniques are necessary to increase their interest and the efficiency of the teaching process. This educational instrument provides the needed knowledge concerning basic circuits and their parameters. Graphics and animations of the general processes in electrical circuits make the problems more interesting, comprehensible and easier to understand. For designing such an instrument, National Instruments' programming environment LabView is used. It is preferred to other simulation software because of its simplicity, flexibility and availability (the free demo version is sufficient to build a simple virtual instrument). LabView uses a graphical programming language and has powerful mathematical functions for analysis and simulations. Its visualization tools for presenting different diagrams are worth recommending, too. It is also specialized in measurement and control and supports a wide variety of hardware. This software is therefore suitable for laboratory classes, where the simulated characteristics of basic electrical circuits can be compared with the real ones measured with a hardware device. For this purpose a two-channel oscilloscope is designed as part of the described project. The main purpose of this instrument in the educational process is to present the desired characteristics of electrical circuits and to familiarize students with the general functions of an oscilloscope. The project combines several features appropriate for teaching purposes: well-presented information with graphics, ease of operation, and delivery of the necessary knowledge. This method of teaching is more interesting and attractive to the audience, and the information is assimilated more quickly, with less effort.
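    The frequency-characteristics side of such a virtual oscilloscope boils down to a spectrum computed from sampled data. LabView's own analysis nodes are graphical and are not reproduced here; the plain-Python sketch below shows the underlying step: sample a test tone and locate the dominant DFT bin. The sample rate and tone frequency are arbitrary illustrative values.

```python
# Spectrum-display sketch: a naive discrete Fourier transform over sampled
# data, then the bin with the largest magnitude gives the dominant frequency.
import math

def dft_magnitudes(samples):
    """Magnitudes of the first N/2 DFT bins (the non-mirrored half)."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags

sample_rate = 64  # Hz, illustrative
signal = [math.sin(2 * math.pi * 5 * i / sample_rate) for i in range(64)]
mags = dft_magnitudes(signal)
peak_hz = mags.index(max(mags)) * sample_rate / len(signal)
print(peak_hz)  # recovers the 5 Hz test tone
```

    A production instrument would use an FFT and windowing rather than this O(n²) loop, but the mapping from bin index to frequency (bin * sample_rate / N) is the same.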

  12. AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.

    2017-02-01

    ImageJ is a graphical user interface (GUI) driven, public domain, Java-based software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI-driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of an undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.
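    The core of time-series differential photometry is simple to state: divide the target star's flux by an ensemble of comparison stars in the same image, so that shared atmospheric and instrumental variations cancel. The sketch below is an illustrative Python reduction of that idea with made-up counts, not AIJ's actual implementation.

```python
# Differential photometry sketch: relative flux = target / sum(comparisons),
# per frame. Shared transparency changes cancel; real signals (e.g. a transit
# dip on the target only) survive. The count values below are invented.

def relative_flux(target_counts, comparison_counts):
    """Per-frame target flux divided by the ensemble comparison flux."""
    return [t / sum(comps) for t, comps in zip(target_counts, comparison_counts)]

# Frame 2 has a 1% dip on the target only; frame 3 has a shared 5% transparency loss.
target = [1000.0, 990.0, 950.0]
comps = [[2000.0, 3000.0], [2000.0, 3000.0], [1900.0, 2850.0]]
rel = relative_flux(target, comps)
print([round(r, 4) for r in rel])
```

    In the output, the shared 5% loss in the last frame divides out entirely, while the 1% target-only dip remains visible; that cancellation is what makes ultra-precise ground-based light curves possible.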

  13. A simulated training model for laparoscopic pyloromyotomy: Is 3D printing the way of the future?

    PubMed

    Williams, Andrew; McWilliam, Morgan; Ahlin, James; Davidson, Jacob; Quantz, Mackenzie A; Bütter, Andreana

    2018-05-01

    Hypertrophic pyloric stenosis (HPS) is a common neonatal condition treated with open or laparoscopic pyloromyotomy. 3D-printed organs offer realistic simulations to practice surgical techniques. The purpose of this study was to validate a 3D HPS stomach model and assess model reliability and surgical realism. Medical students, general surgery residents, and adult and pediatric general surgeons were recruited from a single center. Participants were videotaped three times performing a laparoscopic pyloromyotomy using box trainers and 3D-printed stomachs. Attempts were graded independently by three reviewers using GOALS and Task Specific Assessments (TSA). Participants were surveyed using the Index of Agreement of Assertions on Model Accuracy (IAAMA). Participants reported their experience levels as novice (22%), inexperienced (26%), intermediate (19%), and experienced (33%). Interrater reliability was similar for overall average GOALS and TSA scores. There was a significant improvement in GOALS (p<0.0001) and TSA scores (p=0.03) between attempts and overall. Participants felt the model accurately simulated a laparoscopic pyloromyotomy (82%) and would be a useful tool for beginners (100%). A 3D-printed stomach model for simulated laparoscopic pyloromyotomy is a useful training tool for learners to improve laparoscopic skills. The GOALS and TSA provide reliable technical skills assessments. Level of evidence: II.

  14. Examining Wikipedia's Value as an Information Source Using the California State University-Chico Website Evaluation Guidelines

    ERIC Educational Resources Information Center

    Upchurch, John

    2011-01-01

    The purpose of this work is to examine Wikipedia's role as a tool for instruction in website evaluation. Wikipedia's purpose, structural elements and potential failings as an authoritative information source are examined. Also presented are rationales for using Wikipedia as an instructional tool, namely the overwhelming popularity of Wikipedia.…

  15. "Thinking about Drinking": Exploring Children's Perceptions of Alcohol Using the Draw and Write Tool

    ERIC Educational Resources Information Center

    Farmer, Siobhan; Porcellato, Lorna

    2016-01-01

    Purpose: The purpose of this paper is to explore perceptions of alcohol held by schoolchildren using the "Draw and Write" tool, to inform the planning of alcohol education in the classroom setting. Design/methodology/approach: A specifically designed "Draw and Write" booklet was used with 169 children aged nine to ten years…

  16. Higher Education Institution Sustainability Assessment Tools: Considerations on Their Use in Brazil

    ERIC Educational Resources Information Center

    de Araújo Góes, Heloisa Cronemberger; Magrini, Alessandra

    2016-01-01

    Purpose: The purpose of this paper is to gather elements to propose a sustainability assessment tool (SAT) to be used in higher education institutions (HEIs) in Brazil and the related program to be created for SAT dissemination and HEI monitoring, publication of results and benchmarking. Design/methodology/approach: The characteristics of eight…

  17. Role of Social Software Tools in Education: A Literature Review

    ERIC Educational Resources Information Center

    Minocha, Shailey

    2009-01-01

    Purpose: The purpose of this paper is to provide a review of literature on the role of Web 2.0 or social software tools in education. Design/methodology/approach: This paper is a critical and comprehensive review of a range of literature sources (until January 2009) addressing the various issues related to the educator's perspective of pedagogical…

  18. Peer Coaching as an Institutionalised Tool for Professional Development: The Perceptions of Tutors in a Nigerian College

    ERIC Educational Resources Information Center

    Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi

    2013-01-01

    Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…

  19. E-Portfolio, a Valuable Job Search Tool for College Students

    ERIC Educational Resources Information Center

    Yu, Ti

    2012-01-01

    Purpose: The purpose of this paper is to find answers to the following questions: How do employers think about e-portfolios? Do employers really see e-portfolios as a suitable hiring tool? Which factors in students' e-portfolios attract potential employers? Can e-portfolios be successfully used by students in their search for a job?…

  20. The Norwegian National Summary Care Record: a qualitative analysis of doctors' use of and trust in shared patient information.

    PubMed

    Dyb, Kari; Warth, Line Lundvoll

    2018-04-06

    This paper explores Norwegian doctors' use of and experiences with a national tool for sharing core patient health information. The summary care record (SCR; the Kjernejournal in Norwegian) is the first national system for sharing patient information among the various levels and institutions of health care throughout the country. The health authorities have invested heavily in the development, implementation and deployment of this tool, and as of 2017 all Norwegian citizens have a personalised SCR. However, as there remains limited knowledge about health professionals' use of, experiences with and opinions regarding this new tool, the purpose of this study was to explore doctors' direct SCR experiences. We conducted 25 in-depth interviews with 10 doctors from an emergency ward, 5 doctors from an emergency clinic and 10 doctors from 5 general practitioner offices. We then transcribed, thematically coded and analysed the interviews utilising a grounded theory approach. The SCRs contain several features for providing core patient information that is particularly relevant in acute or emergency situations; nonetheless, we found that the doctors generally used only one of the tool's six functions, namely, the pharmaceutical summary. In addition, they primarily used this summary for a few subgroups of patients, including in the emergency ward for unconscious patients, for elderly patients with multiple prescriptions and for patients with substance abuse conditions. The primary difference of the pharmaceutical summary compared with the other functions of the tool is that patient information is automatically updated from a national pharmaceutical server, while other clinically relevant functions, like the critical information category, require manual updates by the health professionals themselves, thereby potentially causing variations in the accuracy, completeness and trustworthiness of the data. Therefore, we can assume that the popularity of the pharmaceutical summary among doctors is based on their preference to place their trust in - and therefore more often utilise - automatically updated information. In addition, the doctors' lack of trust in manually updated information might have severe implications for the future success of the SCR and for similar digital tools for sharing patient information.

  1. 7 CFR 226.1 - General purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 226.1 Section 226.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS CHILD AND ADULT CARE FOOD PROGRAM General § 226.1 General purpose and scope. This part announces the...

  2. 7 CFR 225.1 - General purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 225.1 Section 225.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SUMMER FOOD SERVICE PROGRAM General § 225.1 General purpose and scope. This part establishes the regulations...

  3. Design mentoring tool.

    DOT National Transportation Integrated Search

    2011-01-01

    In 2004, a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers in mentoring new engineers in the INDOT design process and to improve their technical competency. This approach saves se...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, Andy (Edinburgh U.); Butterworth, Jonathan

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.

  5. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translator. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  6. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  7. Access, use and perceptions regarding Internet, cell phones and PDAs as a means for health promotion for people living with HIV in Peru

    PubMed Central

    Curioso, Walter H; Kurth, Ann E

    2007-01-01

    Background Internet tools, cell phones, and other information and communication technologies are being used by HIV-positive people on their own initiative. Little is known about the perceptions of HIV-positive people towards these technologies in Peru. The purpose of this paper is to report on perceptions towards use of information and communication technologies as a means to support antiretroviral medication adherence and HIV transmission risk reduction. Methods We conducted a qualitative study (in-depth interviews) among adult people living with HIV in two community-based clinics in Peru. Results 31 HIV-positive individuals in Lima were interviewed (n = 28 men, 3 women). People living with HIV in Peru are using tools such as cell phones, and the Internet (via E-mail, chat, list-serves) to support their HIV care and to make social and sexual connections. In general, they have positive perceptions about using the Internet, cell phones and PDAs for HIV health promotion interventions. Conclusion Health promotion interventions using information and communication technology tools among people living with HIV in resource-constrained settings may be acceptable and feasible, and can build on existing patterns of use. PMID:17850656

  8. Access, use and perceptions regarding Internet, cell phones and PDAs as a means for health promotion for people living with HIV in Peru.

    PubMed

    Curioso, Walter H; Kurth, Ann E

    2007-09-12

    Internet tools, cell phones, and other information and communication technologies are being used by HIV-positive people on their own initiative. Little is known about the perceptions of HIV-positive people towards these technologies in Peru. The purpose of this paper is to report on perceptions towards use of information and communication technologies as a means to support antiretroviral medication adherence and HIV transmission risk reduction. We conducted a qualitative study (in-depth interviews) among adult people living with HIV in two community-based clinics in Peru. 31 HIV-positive individuals in Lima were interviewed (n = 28 men, 3 women). People living with HIV in Peru are using tools such as cell phones, and the Internet (via E-mail, chat, list-serves) to support their HIV care and to make social and sexual connections. In general, they have positive perceptions about using the Internet, cell phones and PDAs for HIV health promotion interventions. Health promotion interventions using information and communication technology tools among people living with HIV in resource-constrained settings may be acceptable and feasible, and can build on existing patterns of use.

  9. 29 CFR 1910.242 - Hand and portable powered tools and equipment, general.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. ... 29 Labor 5 2011-07-01 2011-07-01 false Hand and portable powered tools and equipment, general... Powered Tools and Other Hand-Held Equipment § 1910.242 Hand and portable powered tools and equipment...

  10. 29 CFR 1910.242 - Hand and portable powered tools and equipment, general.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. ... 29 Labor 5 2010-07-01 2010-07-01 false Hand and portable powered tools and equipment, general... Powered Tools and Other Hand-Held Equipment § 1910.242 Hand and portable powered tools and equipment...

  11. A Performance-Based Web Budget Tool

    ERIC Educational Resources Information Center

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocation of funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  12. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  13. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  14. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  15. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  16. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  17. 32 CFR 644.134 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... buildings, including land incidental thereto, suitable for the general use of Government agencies, including...) Special-purpose space is space in buildings, including land incidental thereto, wholly or predominantly utilized for the special purposes of an agency, and not generally suitable for general-purpose use...

  18. 26 CFR 1.355-0 - Outline of sections.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... (b) Independent business purpose. (1) Independent business purpose requirement. (2) Corporate business purpose. (3) Business purpose for distribution. (4) Business purpose as evidence of nondevice. (5... distribution of earnings and profits. (1) In general. (2) Device factors. (i) In general. (ii) Pro rata...

  19. 26 CFR 1.355-0 - Outline of sections.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    .... (b) Independent business purpose. (1) Independent business purpose requirement. (2) Corporate business purpose. (3) Business purpose for distribution. (4) Business purpose as evidence of nondevice. (5... distribution of earnings and profits. (1) In general. (2) Device factors. (i) In general. (ii) Pro rata...

  20. Cafe Variome: general-purpose software for making genotype-phenotype data discoverable in restricted or open access contexts.

    PubMed

    Lancaster, Owen; Beck, Tim; Atlan, David; Swertz, Morris; Thangavelu, Dhiwagaran; Veal, Colin; Dalgleish, Raymond; Brookes, Anthony J

    2015-10-01

    Biomedical data sharing is desirable, but problematic. Data "discovery" approaches-which establish the existence rather than the substance of data-precisely connect data owners with data seekers, and thereby promote data sharing. Cafe Variome (http://www.cafevariome.org) was therefore designed to provide a general-purpose, Web-based, data discovery tool that can be quickly installed by any genotype-phenotype data owner, or network of data owners, to make safe or sensitive content appropriately discoverable. Data fields or content of any type can be accommodated, from simple ID and label fields through to extensive genotype and phenotype details based on ontologies. The system provides a "shop window" in front of data, with main interfaces being a simple search box and a powerful "query-builder" that enable very elaborate queries to be formulated. After a successful search, counts of records are reported grouped by "openAccess" (data may be directly accessed), "linkedAccess" (a source link is provided), and "restrictedAccess" (facilitated data requests and subsequent provision of approved records). An administrator interface provides a wide range of options for system configuration, enabling highly customized single-site or federated networks to be established. Current uses include rare disease data discovery, patient matchmaking, and a Beacon Web service. © 2015 WILEY PERIODICALS, INC.
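
The access-level grouping of search hits described above can be sketched in a few lines; the record dicts and the `access` field name here are hypothetical illustrations, not the actual Cafe Variome schema or API:

```python
from collections import Counter

# Hypothetical search hits; the real Cafe Variome records are far richer
# (genotype/phenotype fields, ontology terms, etc.).
hits = [
    {"id": "rec1", "access": "openAccess"},
    {"id": "rec2", "access": "linkedAccess"},
    {"id": "rec3", "access": "restrictedAccess"},
    {"id": "rec4", "access": "openAccess"},
]

# Report counts of matching records grouped by access level, mirroring the
# openAccess / linkedAccess / restrictedAccess split reported after a search.
counts = Counter(hit["access"] for hit in hits)
for level in ("openAccess", "linkedAccess", "restrictedAccess"):
    print(level, counts[level])
```

The point of reporting counts rather than records is exactly the "discovery" model: a seeker learns that matching data exist without seeing restricted content.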

  1. Construction of measurement uncertainty profiles for quantitative analysis of genetically modified organisms based on interlaboratory validation data.

    PubMed

    Macarthur, Roy; Feinberg, Max; Bertheau, Yves

    2010-01-01

    A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be < 0.45% to demonstrate compliance, and > 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%.
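
The enforcement arithmetic in the last sentence follows directly from the 50-200% expected range: for a labeling threshold of 0.9% GMO, compliance is demonstrable only below 0.9 × 0.5 = 0.45%, and noncompliance only above 0.9 × 2.0 = 1.8%. A minimal sketch of that decision rule (the function name and structure are illustrative, not from the paper):

```python
THRESHOLD = 0.9                        # EU labeling threshold, % GMO
LOWER_FACTOR, UPPER_FACTOR = 0.5, 2.0  # 50-200% expected measurement range

def assess(result_pct: float) -> str:
    """Classify a single analytical result against the labeling threshold,
    allowing for the expected measurement uncertainty range."""
    if result_pct < THRESHOLD * LOWER_FACTOR:   # below 0.45%
        return "compliant"
    if result_pct > THRESHOLD * UPPER_FACTOR:   # above 1.8%
        return "noncompliant"
    return "inconclusive"

print(assess(0.3))   # clearly below the threshold even with uncertainty
print(assess(1.0))   # inside the uncertainty band around 0.9%
print(assess(2.5))   # clearly above the threshold
```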

  2. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

    Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general purpose VMEbus electronics, and running applications that were coded by hand, mostly using the C programming language under VxWorks. As we are moving towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, and used the control algorithm of a standard telescope main axis as a first example. We wanted to have a clear work-flow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host, and where the time spent debugging on target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach so that the result could be deployed on various platforms. We have selected the MathWorks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.

  3. A Field Test of Web-Based Screening for Dry Eye Disease to Enhance Awareness of Eye Problems Among General Internet Users: A Latent Strategy to Promote Health

    PubMed Central

    Uchino, Miki; Kawazoe, Takashi; Kamiyashiki, Masaaki; Sano, Kokoro; Tsubota, Kazuo

    2013-01-01

    Background A Web-based self-check system including a brief questionnaire would seem to be a suitable tool for rapid disease screening. Objective The purpose of this preliminary study was to test a Web-based self-screening questionnaire for drawing attention to dry eye disease among general Internet users and identifying those with a higher risk of developing the condition. Methods A survey website was launched and used to recruit participants from general Internet users. In the first phase, volunteers were asked to complete a Web-based self-screening questionnaire containing 12 questions on dry eye symptoms. The second phase focused on the respondents who reported five or more dry eye symptoms and expressed their intention to seek medical attention. These participants performed the Schirmer test, for evaluating tear production, and completed a paper-based lifestyle questionnaire to provide relevant background data. Results Of the 1689 visitors to the website, 980 (58.0%) volunteers completed the Web-based self-screening questionnaire. Among these, 355 (36.2%) respondents reported five or more dry eye symptoms. Then, 99 (27.9%) of the symptomatic participants performed the Schirmer test and completed the paper-based lifestyle questionnaire. Out of these, 32 (32.2%) had abnormal tear production (≤5 mm). Conclusions The proposed Web-based self-screening questionnaire seems to be a promising tool for raising awareness of dry eye disease among general Internet users and identifying those with a higher risk of developing the condition, although further research is needed to validate its effectiveness. PMID:24072379

  4. LAILAPS-QSM: A RESTful API and JAVA library for semantic query suggestions.

    PubMed

    Chen, Jinbo; Scholz, Uwe; Zhou, Ruonan; Lange, Matthias

    2018-03-01

    In order to access and filter the content of life-science databases, full-text search is a widely applied query interface, but its high flexibility and intuitiveness is paid for with potentially imprecise and incomplete query results. To reduce this drawback, query assistance systems suggest the combinations of keywords with the highest potential to match most of the relevant data records. Widespread approaches are syntactic query corrections that avoid misspellings and support expansion of words by suffixes and prefixes. Synonym expansion approaches apply thesauri, ontologies, and query logs; all need laborious curation and maintenance, and access to query logs is in general restricted. Approaches that infer related queries from a user's query profile (research field, geographic location, co-authorship, affiliation, etc.) require user registration and public accessibility, which contradicts privacy concerns. To overcome these drawbacks, we implemented LAILAPS-QSM, a machine learning approach that reconstructs possible linguistic contexts of a given keyword query. The context is inferred from the text records stored in the databases to be queried, or extracted from PubMed abstracts and UniProt data for general-purpose query suggestion. The supplied tool suite enables the pre-processing of these text records and the subsequent computation of customized distributed word vectors, which are used to suggest alternative keyword queries. The quality of the query suggestions was evaluated for plant-science use cases. Locally available experts enabled a cost-efficient quality assessment in the categories trait, biological entity, taxonomy, affiliation, and metabolic function, performed using ontology term similarities. The mean information-content similarity of LAILAPS-QSM for 15 representative queries is 0.70, with 34% scoring above 0.80; in comparison, the information-content similarity for query suggestions made by human experts is 0.90. The software is available either as a tool set to build and train dedicated query suggestion services or as an already trained general-purpose RESTful web service. The service uses open interfaces to be seamlessly embeddable into database frontends. The JAVA implementation uses highly optimized data structures and streamlined code to provide fast and scalable responses to web service calls. The source code of LAILAPS-QSM is available under GNU General Public License version 2 in the Bitbucket GIT repository: https://bitbucket.org/ipk_bit_team/bioescorte-suggestion.
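
The core idea of suggesting alternative keywords via distributed word vectors reduces to nearest-neighbour search under cosine similarity. A toy sketch (the vocabulary and 3-dimensional vectors are invented for illustration; LAILAPS-QSM trains real vectors from PubMed abstracts and UniProt text):

```python
import math

# Hypothetical distributed word vectors; real embeddings are learned and
# have hundreds of dimensions.
vectors = {
    "drought":  [0.9, 0.1, 0.0],
    "dryness":  [0.8, 0.2, 0.1],
    "yield":    [0.1, 0.9, 0.2],
    "taxonomy": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def suggest(term, k=2):
    """Rank alternative keywords by vector similarity to the query term."""
    ranked = sorted(
        (w for w in vectors if w != term),
        key=lambda w: cosine(vectors[term], vectors[w]),
        reverse=True,
    )
    return ranked[:k]

print(suggest("drought"))
```

Words whose vectors point in a similar direction ("dryness" here) rank first, which is how semantically related query alternatives surface without any curated thesaurus.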

  5. Generation Y, Learner Autonomy and the Potential of Web 2.0 Tools for Language Learning and Teaching

    ERIC Educational Resources Information Center

    Morgan, Liam

    2012-01-01

    Purpose: The purpose of this paper is to examine the relationship between the development of learner autonomy and the application of Web 2.0 tools in the language classroom. Design/methodology/approach: The approach taken is that of qualitative action research within an explicit theoretical framework and the data were collected via surveys and…

  6. National Tests in Denmark--CAT as a Pedagogic Tool

    ERIC Educational Resources Information Center

    Wandall, Jakob

    2011-01-01

    Testing and test results can be used in different ways. They can be used for regulation and control, but they can also be a pedagogic tool for assessment of student proficiency in order to target teaching, improve learning and facilitate local pedagogical leadership. To serve these purposes the test has to be used for low stakes purposes, and to…

  7. The Tools of the Web Assisted Foreign Language Instruction

    ERIC Educational Resources Information Center

    Uzunboylu, Huseyin

    2005-01-01

    The purpose of this study was to review the asynchronous and synchronous tools of Web assisted foreign language instruction. This study was conducted on the basis of a literature survey, so the findings were interpreted and evaluated for the purpose of the study. In the study, we first preferred to give a brief description of each Web…

  8. Management Perception of Introducing Social Networking Sites as a Knowledge Management Tool in Higher Education: A Case Study

    ERIC Educational Resources Information Center

    Garcia, Elaine; Annansingh, Fenio; Elbeltagi, Ibrahim

    2011-01-01

    Purpose: The purpose of this paper is to present a study of the understanding and usage of social networking sites (SNS) as a knowledge management (KM) tool in knowledge-intensive enterprises. Design/methodology/approach: In terms of research approach, the study has taken an interpretivist framework, using a higher education (HE) institution as…

  9. Designing Web 2.0 Based Constructivist-Oriented E-Learning Units

    ERIC Educational Resources Information Center

    Chai, Ching Sing; Woo, Huay Lit; Wang, Qiyun

    2010-01-01

    Purpose: The main purpose of this paper is to present how meaningful e-learning units can be created by using an online tool called Meaningful E-learning Units (MeLU). The paper also aims to describe how created e-learning units can be shared by teachers and students. Design/methodology/approach: This tool can help to produce e-learning units that…

  10. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  12. Lifting the veil on the dynamics of neuronal activities evoked by transcranial magnetic stimulation

    PubMed Central

    Li, Bingshuo; Virtanen, Juha P; Oeltermann, Axel; Schwarz, Cornelius; Giese, Martin A; Ziemann, Ulf

    2017-01-01

    Transcranial magnetic stimulation (TMS) is a widely used non-invasive tool to study and modulate human brain functions. However, TMS-evoked activity of individual neurons has remained largely inaccessible due to the large TMS-induced electromagnetic fields. Here, we present a general method providing direct in vivo electrophysiological access to TMS-evoked neuronal activity 0.8–1 ms after TMS onset. We translated human single-pulse TMS to rodents and unveiled time-grained evoked activities of motor cortex layer V neurons that show high-frequency spiking within the first 6 ms depending on TMS-induced current orientation and a multiphasic spike-rhythm alternating between excitation and inhibition in the 6–300 ms epoch, all of which can be linked to various human TMS responses recorded at the level of spinal cord and muscles. The advance here facilitates a new level of insight into the TMS-brain interaction that is vital for developing this non-invasive tool to purposefully explore and effectively treat the human brain. PMID:29165241

  13. qDIET: toward an automated, self-sustaining knowledge base to facilitate linking point-of-sale grocery items to nutritional content

    PubMed Central

    Chidambaram, Valliammai; Brewster, Philip J.; Jordan, Kristine C.; Hurdle, John F.

    2013-01-01

    The United States, indeed the world, struggles with a serious obesity epidemic. The costs of this epidemic in terms of healthcare dollar expenditures and human morbidity/mortality are staggering. Surprisingly, clinicians are ill-equipped in general to advise patients on effective, longitudinal weight loss strategies. We argue that one factor hindering clinicians and patients in effective shared decision-making about weight loss is the absence of a metric that can be reasoned about and monitored over time, as clinicians do routinely with, say, serum lipid levels or HgA1C. We propose that a dietary quality measure championed by the USDA and NCI, the HEI-2005/2010, is an ideal metric for this purpose. We describe a new tool, the quality Dietary Information Extraction Tool (qDIET), which is a step toward an automated, self-sustaining process that can link retail grocery purchase data to the appropriate USDA databases to permit the calculation of the HEI-2005/2010. PMID:24551333

  14. TRMM Common Microphysics Products: A Tool for Evaluating Spaceborne Precipitation Retrieval Algorithms

    NASA Technical Reports Server (NTRS)

    Kingsmill, David E.; Yuter, Sandra E.; Hobbs, Peter V.; Rangno, Arthur L.; Heymsfield, Andrew J.; Stith, Jeffrey L.; Bansemer, Aaron; Haggerty, Julie A.; Korolev, Alexei V.

    2004-01-01

    A customized product for analysis of microphysics data collected from aircraft during field campaigns in support of the TRMM program is described. These Common Microphysics Products (CMPs) are designed to aid in evaluation of TRMM spaceborne precipitation retrieval algorithms. Information needed for this purpose (e.g., particle size spectra and habit, liquid and ice water content) was derived using a common processing strategy on the wide variety of microphysical instruments and raw native data formats employed in the field campaigns. The CMPs are organized into an ASCII structure to allow easy access to the data for those less familiar with microphysical data processing and without the tools to accomplish it. Detailed examples of the CMPs show their potential and some of their limitations. This approach may be a first step toward developing a generalized microphysics format and an associated community-oriented, non-proprietary software package for microphysics data processing, initiatives that would likely broaden community access to and use of microphysics datasets.

  15. Development of the Power Simulation Tool for Energy Balance Analysis of Nanosatellites

    NASA Astrophysics Data System (ADS)

    Kim, Eun-Jung; Sim, Eun-Sup; Kim, Hae-Dong

    2017-09-01

    The energy balance in a satellite needs to be designed properly for the satellite to safely operate and carry out successive missions on an orbit. In this study, an analysis program was developed using the MATLAB® graphic user interface (GUI) for nanosatellites. This program was used in a simulation to confirm the generated power, consumed power, and battery power of the satellites on the orbit, and its performance was verified by applying different satellite operational modes and units. For data transmission, STK®-MATLAB® connectivity was used to send the generated power from STK® to MATLAB® automatically. Moreover, this program is general-purpose; therefore, it can be applied to nanosatellites that have missions or shapes that are different from those of the satellites in this study. This power simulation tool could be used not only to calculate a suitable power budget when developing the power systems, but also to analyze the remaining energy balance in the satellites.
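
The underlying energy-balance bookkeeping is simple to state: at each time step, battery energy changes by (generated - consumed) power times the step length, clamped to the battery's capacity. A minimal sketch under assumed numbers (the step size, capacity, and power profile below are invented for illustration, not taken from the tool described above):

```python
STEP_S = 60          # time step, seconds (assumed)
CAPACITY_WH = 20.0   # hypothetical battery capacity, Wh

def simulate(generated_w, consumed_w, battery_wh=CAPACITY_WH):
    """Track battery energy (Wh) per step, clamped to [0, capacity]."""
    history = []
    for gen, con in zip(generated_w, consumed_w):
        battery_wh += (gen - con) * STEP_S / 3600.0  # W * s -> Wh
        battery_wh = min(max(battery_wh, 0.0), CAPACITY_WH)
        history.append(battery_wh)
    return history

# Toy orbit: sunlit half generates 8 W, eclipse generates 0 W; constant 5 W load.
gen = [8.0] * 30 + [0.0] * 30
con = [5.0] * 60
profile = simulate(gen, con)
print(round(profile[-1], 2))
```

A positive energy balance means the battery recovers during the sunlit arc at least what the eclipse arc drains; the simulation makes that visible per mode and per unit.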

  16. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (k_eff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  17. Internet (WWW) based system of ultrasonic image processing tools for remote image analysis.

    PubMed

    Zeng, Hong; Fei, Ding-Yu; Fu, Cai-Ting; Kraft, Kenneth A

    2003-07-01

    Ultrasonic Doppler color imaging can provide anatomic information and simultaneously render flow information within blood vessels for diagnostic purposes. Many researchers are currently developing ultrasound image processing algorithms in order to provide physicians with accurate clinical parameters from the images. Because researchers use a variety of computer languages and work on different computer platforms to implement their algorithms, it is difficult for other researchers and physicians to access those programs. A system has been developed using World Wide Web (WWW) technologies and HTTP communication protocols to publish our ultrasonic Angle Independent Doppler Color Image (AIDCI) processing algorithm and several general measurement tools on the Internet, where authorized researchers and physicians can easily access the program using web browsers to carry out remote analysis of their local ultrasonic images or images provided from the database. In order to overcome potential incompatibility between programs and users' computer platforms, ActiveX technology was used in this project. The technique developed may also be used in other research fields.

  18. Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak

    2010-01-01

    In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.
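
The per-function, per-message-size breakdown described above amounts to aggregating a trace of (MPI function, message size, elapsed time) records into binned totals. A toy sketch of that aggregation (the trace records, bin edges, and numbers are invented; real MPInside/TAU output is far more detailed):

```python
from collections import defaultdict

# Hypothetical per-call trace records: (function, message bytes, seconds).
trace = [
    ("MPI_Sendrecv",  1024,  0.002),
    ("MPI_Sendrecv",  65536, 0.010),
    ("MPI_Allreduce", 8,     0.001),
    ("MPI_Allreduce", 8,     0.003),
    ("MPI_Bcast",     4096,  0.004),
]

def size_bin(nbytes):
    """Coarse message-size bins (assumed edges, for illustration)."""
    if nbytes <= 1024:
        return "<=1KiB"
    if nbytes <= 65536:
        return "<=64KiB"
    return ">64KiB"

# Total time spent in each MPI function, split by message-size bin.
totals = defaultdict(float)
for func, nbytes, seconds in trace:
    totals[(func, size_bin(nbytes))] += seconds

for key in sorted(totals):
    print(key, round(totals[key], 4))
```

Summing communication time this way is what lets the analysis separate, say, many small MPI_Allreduce calls from a few large MPI_Sendrecv exchanges when scaling the core count.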

  19. qDIET: toward an automated, self-sustaining knowledge base to facilitate linking point-of-sale grocery items to nutritional content.

    PubMed

    Chidambaram, Valliammai; Brewster, Philip J; Jordan, Kristine C; Hurdle, John F

    2013-01-01

    The United States, indeed the world, struggles with a serious obesity epidemic. The costs of this epidemic in terms of healthcare dollar expenditures and human morbidity/mortality are staggering. Surprisingly, clinicians are ill-equipped in general to advise patients on effective, longitudinal weight loss strategies. We argue that one factor hindering clinicians and patients in effective shared decision-making about weight loss is the absence of a metric that can be reasoned about and monitored over time, as clinicians do routinely with, say, serum lipid levels or HgA1C. We propose that a dietary quality measure championed by the USDA and NCI, the HEI-2005/2010, is an ideal metric for this purpose. We describe a new tool, the quality Dietary Information Extraction Tool (qDIET), which is a step toward an automated, self-sustaining process that can link retail grocery purchase data to the appropriate USDA databases to permit the calculation of the HEI-2005/2010.

  20. Friction and lubrication modelling in sheet metal forming: Influence of lubrication amount, tool roughness and sheet coating on product quality

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Carleer, B.

    2017-09-01

    In the stamping of automotive parts, friction and lubrication play a key role in achieving high quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. This paper presents a selection of results considering friction and lubrication modelling in sheet metal forming simulations of a front fender product. For varying lubrication conditions, the front fender can show either wrinkling or fractures. The front fender is modelled using different lubrication amounts, tool roughnesses and sheet coatings to show the strong influence of friction on both part quality and overall production stability. For this purpose, the TriboForm software is used in combination with the AutoForm software. The results demonstrate that the TriboForm software enables the simulation of friction behaviour for varying lubrication conditions, resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.

  1. Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view

    NASA Astrophysics Data System (ADS)

    Locati, Mario; Rovida, Andrea; Albini, Paola

    2014-05-01

    Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to perform their daily work using general-purpose tools and/or to code their own to address their specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost on tedious tasks: searching for the data and manually reformatting it in order to jump from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their sources, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.

  2. Experience with Using Multiple Types of Visual Educational Tools during Problem-Based Learning.

    PubMed

    Kang, Bong Jin

    2012-06-01

    This study describes the experience of using multiple types of visual educational tools in the setting of problem-based learning (PBL). The author intends to demonstrate their roles in diverse and efficient ways of clinical reasoning and problem solving. Visual educational tools were introduced in a lecture that covered their various types, possible benefits, and some examples. Each group made one mechanistic case diagram per week, and each student designed one diagnostic schema or therapeutic algorithm per week, based on their learning issues. The students were also asked to provide commentary, intended to give insight into the truthfulness of their efforts. Subsequently, the author administered a questionnaire about the usefulness and weaknesses of visual educational tools and the difficulties encountered in performing the work. The quality of the products was also assessed by the author. There were many complaints about the adequacy of the introduction to visual educational tools, which was also reflected in the many initially inappropriate products. However, the exercise presentation in the first week improved the level of understanding of their purposes and the method of design. In general, students agreed on the benefits of visual educational tools in providing a deep understanding of the cases and the possibility of solving clinical problems efficiently. The commentary was helpful in evaluating the truthfulness of the students' efforts, and students suggested increasing the percentage of their scores in consideration of those efforts. Using multiple types of visual educational tools during PBL can be useful in understanding the diverse routes of clinical reasoning and clinical features.

  3. SLUG - stochastically lighting up galaxies - III. A suite of tools for simulated photometry, spectroscopy, and Bayesian inference with stochastic stellar populations

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan

    2015-09-01

    Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly-enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
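The inference step performed by BAYESPHOT (estimating physical properties from unresolved photometry against a model library) can be illustrated with a minimal sketch. Everything here is a simplification under stated assumptions: the library, the toy photometric model, and the Gaussian error model are invented for illustration, and the real BAYESPHOT uses kernel density estimation over multi-band SLUG libraries rather than this direct reweighting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model library: each entry maps physical parameters
# (log mass, log age) to a predicted magnitude. Ranges are assumptions.
n_lib = 5000
log_mass = rng.uniform(2.0, 5.0, n_lib)   # log10(M / Msun)
log_age = rng.uniform(6.0, 9.0, n_lib)    # log10(age / yr)
# Toy photometric model: massive, young clusters are brighter (more negative)
mag = -2.5 * log_mass + (log_age - 6.0) + rng.normal(0.0, 0.1, n_lib)

def posterior_mean(obs_mag, sigma=0.2):
    """Weight every library model by its Gaussian photometric likelihood
    and return posterior-mean log mass and log age (flat prior over the
    library, which encodes the prior by construction)."""
    log_like = -0.5 * ((mag - obs_mag) / sigma) ** 2
    w = np.exp(log_like - log_like.max())  # subtract max to avoid underflow
    w /= w.sum()
    return float(w @ log_mass), float(w @ log_age)

m, a = posterior_mean(obs_mag=-8.0)
```

A brighter observed magnitude shifts the posterior toward more massive clusters, which is the qualitative behaviour the library-reweighting approach is meant to capture.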

  4. Construction and Validation of a Holistic Education School Evaluation Tool Using Montessori Erdkinder Principles

    ERIC Educational Resources Information Center

    Setari, Anthony Philip

    2016-01-01

    The purpose of this study was to construct a holistic education school evaluation tool using Montessori Erdkinder principles, and begin the validation process of examining the proposed tool. This study addresses a vital need in the holistic education community for a school evaluation tool. The tool construction process included using Erdkinder…

  5. Design mentoring tool : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers in mentoring new engineers in the INDOT design process and improve their technical competency. This approach saves seni...

  6. 26 CFR 1.355-0 - Outline of sections.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... distributed. (b) Independent business purpose. (1) Independent business purpose requirement. (2) Corporate business purpose. (3) Business purpose for distribution. (4) Business purpose as evidence of nondevice. (5... distribution of earnings and profits. (1) In general. (2) Device factors. (i) In general. (ii) Pro rata...

  7. 26 CFR 1.355-0 - Outline of sections.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... distributed. (b) Independent business purpose. (1) Independent business purpose requirement. (2) Corporate business purpose. (3) Business purpose for distribution. (4) Business purpose as evidence of nondevice. (5... distribution of earnings and profits. (1) In general. (2) Device factors. (i) In general. (ii) Pro rata...

  8. 26 CFR 1.355-0 - Outline of sections.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... distributed. (b) Independent business purpose. (1) Independent business purpose requirement. (2) Corporate business purpose. (3) Business purpose for distribution. (4) Business purpose as evidence of nondevice. (5... distribution of earnings and profits. (1) In general. (2) Device factors. (i) In general. (ii) Pro rata...

  9. Operational and Strategic Controlling Tools in Microenterprises - Case Study

    NASA Astrophysics Data System (ADS)

    Konsek-Ciechońska, Justyna

    2017-12-01

    Globalisation and the increasing requirements of the environment cause executives and supervisors to search for ever more refined solutions that allow them to streamline and improve the effectiveness of company operations. One such tool, used more and more often, is controlling, whose role has substantially increased in recent years. It is now implemented not only in large companies with foreign capital, but also in increasingly smaller entities, which are beginning to notice the positive effects of implementing the principles and tools of controlling - both operational and strategic. The purpose of the article is to demonstrate the practical side of controlling tools that can be used in the operations conducted by microenterprises.

  10. On the general concept of buoyancy in sedimentation and ultracentrifugation.

    PubMed

    Piazza, Roberto; Buzzaccaro, Stefano; Secchi, Eleonora; Parola, Alberto

    2013-08-02

    Gravity or ultracentrifuge settling of colloidal particles and macromolecules usually involves several disperse species, either because natural and industrial colloids display a large size polydispersity, or because additives are put in on purpose to allow for density-based fractionation of the suspension. Such 'macromolecular crowding', however, may have surprising effects on sedimentation, for it strongly affects the buoyant force felt by a settling particle. Here we show that, as a matter of fact, the standard Archimedes' principle is just a limiting law, valid only for mesoscopic particles settling in a molecular fluid, and we obtain a fully general expression for the actual buoyancy force providing a microscopic basis to the general thermodynamic analysis of sedimentation in multi-component mixtures. The effective buoyancy also depends on the particle shape, being much more pronounced for thin rods and discs. Our model is successfully tested on simple colloidal mixtures, and used to predict rather unexpected effects, such as denser particles floating on top of a lighter fluid, which we actually observe in targeted experiments. This 'generalized Archimedes principle' may provide a tool to devise novel separation methods sensitive to particle size and shape.

  11. On the general concept of buoyancy in sedimentation and ultracentrifugation

    NASA Astrophysics Data System (ADS)

    Piazza, Roberto; Buzzaccaro, Stefano; Secchi, Eleonora; Parola, Alberto

    2013-08-01

    Gravity or ultracentrifuge settling of colloidal particles and macromolecules usually involves several disperse species, either because natural and industrial colloids display a large size polydispersity, or because additives are put in on purpose to allow for density-based fractionation of the suspension. Such ‘macromolecular crowding’, however, may have surprising effects on sedimentation, for it strongly affects the buoyant force felt by a settling particle. Here we show that, as a matter of fact, the standard Archimedes' principle is just a limiting law, valid only for mesoscopic particles settling in a molecular fluid, and we obtain a fully general expression for the actual buoyancy force providing a microscopic basis to the general thermodynamic analysis of sedimentation in multi-component mixtures. The effective buoyancy also depends on the particle shape, being much more pronounced for thin rods and discs. Our model is successfully tested on simple colloidal mixtures, and used to predict rather unexpected effects, such as denser particles floating on top of a lighter fluid, which we actually observe in targeted experiments. This ‘generalized Archimedes principle’ may provide a tool to devise novel separation methods sensitive to particle size and shape.

  12. Clinical use of the Surgeon General's "My Family Health Portrait" (MFHP) tool: opinions of future health care providers.

    PubMed

    Owens, Kailey M; Marvin, Monica L; Gelehrter, Thomas D; Ruffin, Mack T; Uhlmann, Wendy R

    2011-10-01

    This study examined medical students' and house officers' opinions about the Surgeon General's "My Family Health Portrait" (MFHP) tool. Participants used the tool and were surveyed about tool mechanics, potential clinical uses, and barriers. None of the 97 participants had previously used this tool. The average time to enter a family history was 15 min (range 3 to 45 min). Participants agreed or strongly agreed that the MFHP tool is understandable (98%), easy to use (93%), and suitable for general public use (84%). Sixty-seven percent would encourage their patients to use the tool; 39% would ensure staff assistance. Participants would use the tool to identify patients at increased risk for disease (86%), record family history in the medical chart (84%), recommend preventive health behaviors (80%), and refer to genetics services (72%). Concerns about use of the tool included patient access, information accuracy, technical challenges, and the need for physician education on interpreting family history information.

  13. The development and evaluation of the Screening Tool for the Assessment of Malnutrition in Paediatrics (STAMP©) for use by healthcare staff.

    PubMed

    McCarthy, H; Dixon, M; Crabtree, I; Eaton-Evans, M J; McNulty, H

    2012-08-01

    The early identification of malnutrition and nutrition risk through nutrition screening is common practice in adult clinical care but, in children, this has been hampered by the lack of an appropriate nutrition screening tool. The present study aimed to develop and evaluate a simple, child-specific nutrition screening tool for administration by non-nutrition healthcare professionals. In a two-phase observational study, significant predictors of nutrition risk were identified using a structured questionnaire. These were then combined to produce a nutrition screening tool. For evaluation purposes, the reliability, sensitivity and specificity of the newly-developed Screening Tool for the Assessment of Malnutrition in Paediatrics (STAMP(©)) were estimated by comparing the classification of nutrition risk using the tool with that determined by a full nutritional assessment by a registered dietitian. A total of 122 children were recruited for the development phase and a separate cohort of 238 children was recruited for the evaluation phase. Low percentile weight for age, reported weight loss, discrepancy between weight and height percentile and recently changed appetite were all identified as predictors of nutrition risk. These predictors, together with the expected nutrition risk of clinical diagnoses, were combined to produce STAMP(©). Evaluation of STAMP(©) demonstrated fair to moderate reliability in identifying nutrition risk compared to the nutrition risk classification determined by a registered dietitian (κ = 0.541; 95% confidence interval = 0.461-0.621). Sensitivity and specificity were estimated at 70% (51-84%) and 91% (86-94%), respectively. The present study describes the development and evaluation of a new nutrition screening tool specifically for use in a UK general paediatric inpatient population. © 2012 The Authors. Journal of Human Nutrition and Dietetics © 2012 The British Dietetic Association Ltd.
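The evaluation statistics reported above (sensitivity, specificity, and Cohen's κ against the dietitian's reference classification) all follow from a 2x2 screening-vs-reference table. A minimal sketch of how they are computed is shown below; the cell counts are invented for illustration (only the total, n = 238, matches the evaluation cohort), so the resulting κ is not the study's value.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa for a 2x2 table
    comparing a screening tool against a reference assessment."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement: sum over classes of the products of the marginals
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

# Illustrative counts only (the cell values are invented for the example)
sens, spec, kappa = screening_metrics(tp=35, fp=17, fn=15, tn=171)
```

With these counts the sketch reproduces a sensitivity of 70% and a specificity of about 91%, matching the orders of magnitude reported in the abstract.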

  14. Balancing stability and flexibility in adaptive governance: An analysis of tools available in U.S. environmental law

    USGS Publications Warehouse

    Kundis Craig, Robin; Garmestani, Ahjond S.; Allen, Craig R.; Arnold, Craig Anthony (Tony); Birge, Hannah E.; DeCaro, Daniel A.; Fremier, Alexander K.; Gosnell, Hannah; Schlager, Edella

    2017-01-01

    Adaptive governance must work “on the ground,” that is, it must operate through structures and procedures that the people it governs perceive to be legitimate and fair, as well as incorporating processes and substantive goals that are effective in allowing social-ecological systems (SESs) to adapt to climate change and other impacts. To address the continuing and accelerating alterations that climate change is bringing to SESs, adaptive governance generally will require more flexibility than prior governance institutions have often allowed. However, to function as good governance, adaptive governance must pay real attention to the problem of how to balance this increased need for flexibility with continuing governance stability so that it can foster adaptation to change without being perceived or experienced as perpetually destabilizing, disruptive, and unfair. Flexibility and stability serve different purposes in governance, and a variety of tools exist to strike different balances between them while still preserving the governance institution’s legitimacy among the people governed. After reviewing those purposes and the implications of climate change for environmental governance, we examine psychological insights into the structuring of adaptive governance and the variety of legal tools available to incorporate those insights into adaptive governance regimes. Because the substantive goals of governance systems will differ among specific systems, we do not purport to comment on what the normative or substantive goals of law should be. Instead, we conclude that attention to process and procedure (including participation), as well as increased use of substantive standards (instead of rules), may allow an increased level of substantive flexibility to operate with legitimacy and fairness, providing the requisite levels of psychological, social, and economic stability needed for communities to adapt successfully to the Anthropocene.

  15. The laboratory report: A pedagogical tool in college science courses

    NASA Astrophysics Data System (ADS)

    Ferzli, Miriam

    When viewed as a product rather than a process that aids in student learning, the lab report may become rote, busywork for both students and instructors. Students fail to see the purpose of the lab report, and instructors see them as a heavy grading load. If lab reports are taught as part of a process rather than a product that aims to "get the right answer," they may serve as pedagogical tools in college science courses. In response to these issues, an in-depth, web-based tutorial named LabWrite (www.ncsu.edu/labwrite) was developed to help students and instructors (www.ncsu.edu/labwrite/instructors) understand the purpose of the lab report as grounded in the written discourse and processes of science. The objective of this post-test only quasi-experimental study was to examine the role that in-depth instruction such as LabWrite plays in helping students to develop skills characteristic of scientifically literate individuals. Student lab reports from an introductory-level biology course at NC State University were scored for overall understanding of scientific concepts and scientific ways of thinking. The study also looked at students' attitudes toward science and lab report writing, as well as students' perceptions of lab reports in general. Significant statistical findings from this study show that students using LabWrite were able to write lab reports that showed a greater understanding of scientific investigations (p < .003) and scientific ways of thinking (p < .0001) than students receiving traditional lab report writing instruction. LabWrite also helped students develop positive attitudes toward lab reports as compared to non-LabWrite users (p < .01). Students using LabWrite seemed to perceive the lab report as a valuable tool for determining learning objectives, understanding science concepts, revisiting the lab experience, and documenting their learning.

  16. Balancing stability and flexibility in adaptive governance: an analysis of tools available in U.S. environmental law.

    PubMed

    Craig, Robin Kundis; Garmestani, Ahjond S; Allen, Craig R; Arnold, Craig Anthony Tony; Birgé, Hannah; DeCaro, Daniel A; Fremier, Alexander K; Gosnell, Hannah; Schlager, Edella

    2017-06-30

    Adaptive governance must work "on the ground," that is, it must operate through structures and procedures that the people it governs perceive to be legitimate and fair, as well as incorporating processes and substantive goals that are effective in allowing social-ecological systems (SESs) to adapt to climate change and other impacts. To address the continuing and accelerating alterations that climate change is bringing to SESs, adaptive governance generally will require more flexibility than prior governance institutions have often allowed. However, to function as good governance, adaptive governance must pay real attention to the problem of how to balance this increased need for flexibility with continuing governance stability so that it can foster adaptation to change without being perceived or experienced as perpetually destabilizing, disruptive, and unfair. Flexibility and stability serve different purposes in governance, and a variety of tools exist to strike different balances between them while still preserving the governance institution's legitimacy among the people governed. After reviewing those purposes and the implications of climate change for environmental governance, we examine psychological insights into the structuring of adaptive governance and the variety of legal tools available to incorporate those insights into adaptive governance regimes. Because the substantive goals of governance systems will differ among specific systems, we do not purport to comment on what the normative or substantive goals of law should be. Instead, we conclude that attention to process and procedure (including participation), as well as increased use of substantive standards (instead of rules), may allow an increased level of substantive flexibility to operate with legitimacy and fairness, providing the requisite levels of psychological, social, and economic stability needed for communities to adapt successfully to the Anthropocene.

  17. An Approach for Preoperative Planning and Performance of MR-guided Interventions Demonstrated With a Manual Manipulator in a 1.5T MRI Scanner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seimenis, Ioannis; Tsekos, Nikolaos V.; Keroglou, Christoforos

    2012-04-15

    Purpose: The aim of this work was to develop and test a general methodology for the planning and performance of robot-assisted, MR-guided interventions. This methodology also includes the employment of software tools with appropriately tailored routines to effectively exploit the capabilities of MRI and address the relevant spatial limitations. Methods: The described methodology consists of: (1) a patient-customized feasibility study that focuses on the geometric limitations imposed by the gantry, the robotic hardware, and interventional tools, as well as the patient; (2) stereotactic preoperative planning for initial positioning of the manipulator and alignment of its end-effector with a selected target; and (3) real-time, intraoperative tool tracking and monitoring of the actual intervention execution. Testing was performed inside a standard 1.5T MRI scanner in which the MR-compatible manipulator is deployed to provide the required access. Results: A volunteer imaging study demonstrates the application of the feasibility stage. A phantom study on needle targeting is also presented, demonstrating the applicability and effectiveness of the proposed preoperative and intraoperative stages of the methodology. For this purpose, a manually actuated, MR-compatible robotic manipulation system was used to accurately acquire a prescribed target through alternative approaching paths. Conclusions: The methodology presented and experimentally examined allows the effective performance of MR-guided interventions. It is suitable for, but not restricted to, needle-targeting applications assisted by a robotic manipulation system, which can be deployed inside a cylindrical scanner to provide the required access to the patient, facilitating real-time guidance and monitoring.

  18. Can social support work virtually? Evaluation of rheumatoid arthritis patients’ experiences with an interactive online tool

    PubMed Central

    Kostova, Zlatina; Caiata-Zufferey, Maria; Schulz, Peter J

    2015-01-01

    BACKGROUND: There is strong empirical evidence that the support that chronic patients receive from their environment is fundamental for the way they cope with physical and psychological suffering. Nevertheless, in the case of rheumatoid arthritis (RA), providing the appropriate social support is still a challenge, and such support has often proven to be elusive and unreliable in helping patients to manage the disease. OBJECTIVES: To explore whether and how social support for RA patients can be provided online, and to assess the conditions under which such support is effective. An online support tool was designed to provide patients with both tailored information and opportunities to interact online with health professionals and fellow sufferers. The general purpose was to identify where the support provided did – or did not – help patients, and to judge whether the determinants of success lay more within patients – their engagement and willingness to participate – or within the design of the website itself. METHODS: The present study reports qualitative interviews with 19 users of the tool. A more specific purpose was to elaborate qualitatively on results from a quantitative survey of users, which indicated that any positive impact was confined to practical matters of pain management rather than extending to more fundamental psychological outcomes such as acceptance. RESULTS AND CONCLUSIONS: Overall, online learning and interaction can do much to help patients with the everyday stresses of their disease; however, its potential for more durable positive impact depends on various individual characteristics such as personality traits, existing social networks, and the severity and longevity of the disease. PMID:26252664

  19. Waste in health information systems: a systematic review.

    PubMed

    Awang Kalong, Nadia; Yusof, Maryati

    2017-05-08

    Purpose: The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach: A systematic review was conducted on 19 studies to evaluate Lean transformation and the tools used to remove HIS-related waste in clinical settings. Findings: Ten waste categories were identified, along with their relationships and the applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristics. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be based on the timing, purpose and characteristics of the waste to be removed. Research limitations/implications: An overview of waste and its categories within HIS, analysed from a socio-technical perspective, enabled the identification of root causes in a holistic and rigorous manner. Practical implications: Understanding waste types and their root causes, together with a review of Lean tools, could subsequently lead to the identification of mitigation approaches to prevent future error occurrence. Originality/value: Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.

  20. Generalized extreme gust wind speeds distributions

    USGS Publications Warehouse

    Cheng, E.; Yeung, C.

    2002-01-01

    Since summer 1996, US wind engineers have been using the extreme gust (or 3-s gust) as the basic wind speed to quantify the destruction of extreme winds. In order to better understand these destructive wind forces, it is important to know the appropriate representations of these extreme gust wind speeds. Therefore, the purpose of this study is to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as the diagnostic tool for determining the types of extreme gust wind speed distributions. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study. © 2002 Elsevier Science Ltd. All rights reserved.
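The reduction described above follows from the sign of the shape parameter ξ of the three-parameter generalized extreme value (GEV) distribution: ξ = 0 gives Type I (Gumbel), ξ > 0 gives Type II (Frechet), and ξ < 0 gives Type III (reverse Weibull). A small sketch of the GEV CDF and this classification is shown below; the location/scale values used in the demo call are arbitrary, and this is the textbook parameterization rather than anything specific to the cited study.

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """CDF of the generalized extreme value distribution with location mu,
    scale sigma, and shape xi (standard parameterization: xi = 0 Gumbel,
    xi > 0 Frechet, xi < 0 reverse Weibull)."""
    s = (x - mu) / sigma
    if xi == 0.0:
        return math.exp(-math.exp(-s))        # Gumbel limit
    t = 1.0 + xi * s
    if t <= 0.0:
        # Outside the support: below the lower bound (xi > 0) the CDF is 0,
        # above the upper bound (xi < 0) it is 1.
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def classify(xi, tol=1e-6):
    """Map a fitted shape parameter to the classical type names."""
    if abs(xi) < tol:
        return "Type I (Gumbel)"
    return "Type II (Frechet)" if xi > 0 else "Type III (reverse Weibull)"

# Demo with arbitrary parameters (e.g. gust speeds in m/s)
p = gev_cdf(40.0, 30.0, 5.0, 0.1)
```

Note that some libraries (e.g. SciPy's `genextreme`) use the opposite sign convention for the shape parameter, so the sign must be flipped before applying a classification like this one.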

  1. Integrated Aerodynamic/Structural/Dynamic Analyses of Aircraft with Large Shape Changes

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Chwalowski, Pawel; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2007-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use medium- to high-fidelity tools early in the design stages, when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing.

  2. BioCompoundML: A General Biofuel Property Screening Tool for Biological Molecules Using Random Forest Classifiers

    DOE PAGES

    Whitmore, Leanne S.; Davis, Ryan W.; McCormick, Robert L.; ...

    2016-09-15

    Screening a large number of biologically derived molecules for potential fuel compounds without recourse to experimental testing is important in identifying understudied yet valuable molecules. Experimental testing, although a valuable standard for measuring fuel properties, has several major limitations, including the requirement of testably high quantities, considerable expense, and a large amount of time. This paper discusses the development of a general-purpose fuel property tool, using machine learning, whose outcome is to screen molecules for desirable fuel properties. BioCompoundML adopts a general methodology, requiring as input only a list of training compounds (with identifiers and measured values) and a list of testing compounds (with identifiers). For the training data, BioCompoundML collects open data from the National Center for Biotechnology Information, incorporates user-provided features, imputes missing values, performs feature reduction, builds a classifier, and clusters compounds. BioCompoundML then collects data for the testing compounds, predicts class membership, and determines whether compounds are found in the range of variability of the training data set. We demonstrate this tool using three different fuel properties: research octane number (RON), threshold soot index (TSI), and melting point (MP). Here we provide measures of its success with these properties using randomized train/test measurements: average accuracy is 88% in RON, 85% in TSI, and 94% in MP; average precision is 88% in RON, 88% in TSI, and 95% in MP; and average recall is 88% in RON, 82% in TSI, and 97% in MP. The receiver operator characteristics (area under the curve) were estimated at 0.88 in RON, 0.86 in TSI, and 0.87 in MP. We also measured the success of BioCompoundML by sending 16 compounds for direct RON determination. Finally, we provide a screen of 1977 hydrocarbons/oxygenates within the 8696 compounds in MetaCyc, identifying compounds with high predictive strength for high or low RON.
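The random-forest idea behind BioCompoundML (an ensemble of trees, each trained on a bootstrap sample and a random feature subset, voting on class membership) can be sketched in a few dozen lines. This is a toy illustration with invented synthetic data and one-level stumps instead of full trees; it is not BioCompoundML's actual pipeline, features, or descriptors.

```python
import random

def fit_stump(X, y, feat_idx):
    """One-level tree: choose the (feature, threshold, polarity) with the
    lowest training error over the candidate features."""
    best, best_err = (feat_idx[0], 0.0, 1), float("inf")
    for j in feat_idx:
        for thr in {row[j] for row in X}:
            for sign in (1, -1):
                err = sum(int(sign * (row[j] - thr) > 0) != t
                          for row, t in zip(X, y))
                if err < best_err:
                    best, best_err = (j, thr, sign), err
    return best

def fit_forest(X, y, n_trees=25, seed=0):
    """Toy random forest: bootstrap-sampled stumps, each restricted to a
    random subset of ~sqrt(d) features (real forests grow full trees)."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    k = max(1, int(d ** 0.5))
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        feats = rng.sample(range(d), k)              # random feature subset
        forest.append(fit_stump([X[i] for i in idx],
                                [y[i] for i in idx], feats))
    return forest

def predict(forest, row):
    """Majority vote over the ensemble."""
    votes = sum(int(sign * (row[j] - thr) > 0) for j, thr, sign in forest)
    return int(2 * votes > len(forest))

# Synthetic "screen": class 1 ("high property value") iff feature 0 > 0.5
X = [[i / 10, (9 - i) / 10] for i in range(10)]
y = [int(row[0] > 0.5) for row in X]
forest = fit_forest(X, y)
preds = [predict(forest, row) for row in X]
```

The accuracy/precision/recall figures quoted in the abstract would then come from comparing such ensemble votes against held-out measured labels over repeated randomized train/test splits.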

  3. Stage 2 tool user’s manual.

    DOT National Transportation Integrated Search

    2017-08-01

    The purpose of the Permitted Overweight Truck Corridor Analysis Tool (referred to in this document as the Stage 2 Tool) is to evaluate existing or to create new proposed overweight (OW) truck corridors to estimate the permitted OW truck, pavement, br...

  4. 7 CFR 227.1 - General purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose and scope. The purpose of these regulations is to implement section 19 of the Child Nutrition Act...

  5. 7 CFR 227.1 - General purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose and scope. The purpose of these regulations is to implement section 19 of the Child Nutrition Act...

  6. 7 CFR 227.1 - General purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose and scope. The purpose of these regulations is to implement section 19 of the Child Nutrition Act...

  7. 7 CFR 227.1 - General purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose and scope. The purpose of these regulations is to implement section 19 of the Child Nutrition Act...

  8. 7 CFR 227.1 - General purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose and scope. The purpose of these regulations is to implement section 19 of the Child Nutrition Act...

  9. 21 CFR 864.4010 - General purpose reagent.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false General purpose reagent. 864.4010 Section 864.4010 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Specimen Preparation Reagents § 864.4010 General purpose...

  10. 21 CFR 864.4010 - General purpose reagent.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false General purpose reagent. 864.4010 Section 864.4010 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Specimen Preparation Reagents § 864.4010 General purpose...

  11. 21 CFR 864.4010 - General purpose reagent.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false General purpose reagent. 864.4010 Section 864.4010 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Specimen Preparation Reagents § 864.4010 General purpose...

  12. Microscale Obstacle Resolving Air Quality Model Evaluation with the Michelstadt Case

    PubMed Central

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models because urban obstacles have an important effect on the flow field and thus on the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are one way to resolve the flow field in the urban canopy and to model dispersion while explicitly taking the effect of the buildings into consideration. These models need detailed evaluation through verification and validation to gain confidence in their reliability and to use them as regulatory tools in complex urban geometries. This paper shows the performance of an open-source general-purpose CFD code, OpenFOAM, for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, for which a value of 0.7 was found to give the best statistical metric results. PMID:24027450
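
    In the gradient-diffusion closure that such RANS dispersion models typically use, the turbulent Schmidt number is what converts the eddy viscosity of the flow solution into an eddy diffusivity for the pollutant scalar. A minimal sketch (the numeric viscosity value is illustrative, not taken from the paper; only Sc_t = 0.7 comes from the abstract):

    ```python
    # The turbulent Schmidt number Sc_t relates eddy viscosity to eddy diffusivity:
    #     D_t = nu_t / Sc_t
    # The study found Sc_t = 0.7 gave the best statistical metric results.
    nu_t = 1.4e-2      # turbulent (eddy) viscosity, m^2/s -- illustrative value
    Sc_t = 0.7         # turbulent Schmidt number

    # Eddy diffusivity entering the scalar dispersion transport equation.
    D_t = nu_t / Sc_t
    ```

    A smaller Sc_t thus means stronger turbulent mixing of the scalar relative to momentum, which is why tuning this single constant shifts all the dispersion statistics at once.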

  13. Microscale obstacle resolving air quality model evaluation with the Michelstadt case.

    PubMed

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models because urban obstacles have an important effect on the flow field and thus on the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are one way to resolve the flow field in the urban canopy and to model dispersion while explicitly taking the effect of the buildings into consideration. These models need detailed evaluation through verification and validation to gain confidence in their reliability and to use them as regulatory tools in complex urban geometries. This paper shows the performance of an open-source general-purpose CFD code, OpenFOAM, for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, for which a value of 0.7 was found to give the best statistical metric results.

  14. Utilization of Social Media Platforms for Educational Purposes among the Faculty of Higher Education with Special Reference to Tamil Nadu

    ERIC Educational Resources Information Center

    Vivakaran, Mangala Vadivu; Neelamalar, M.

    2018-01-01

    Social media tools are observed to play a vital role in the renovation of the conventional teaching and learning practices across the globe. Though primarily developed for online social communication, social media platforms tend to possess suitable tools that can be used for instructional purposes in order to initiate active learning among…

  15. Washington State water quality temperature standards as related to reactor operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballowe, J.W.

    1968-08-14

    The purpose of this report is to provide a basic working tool for determining the relationship between the allowable temperature increase within the Columbia River reach at the Hanford Site and the actual temperature increase as associated with various reactor operating modes. This basic tool can be utilized for day-to-day operating purposes or for the achievement of historical information.

  16. QR Codes as Mobile Learning Tools for Labor Room Nurses at the San Pablo Colleges Medical Center

    ERIC Educational Resources Information Center

    Del Rosario-Raymundo, Maria Rowena

    2017-01-01

    Purpose: The purpose of this paper is to explore the use of QR codes as mobile learning tools and examine factors that impact on their usefulness, acceptability and feasibility in assisting the nurses' learning. Design/Methodology/Approach: Study participants consisted of 14 regular, full-time, board-certified LR nurses. Over a two-week period,…

  17. Personal Learning Environments Acceptance Model: The Role of Need for Cognition, e-Learning Satisfaction and Students' Perceptions

    ERIC Educational Resources Information Center

    del Barrio-García, Salvador; Arquero, José L.; Romero-Frías, Esteban

    2015-01-01

    As long as students use Web 2.0 tools extensively for social purposes, there is an opportunity to improve students' engagement in Higher Education by using these tools for academic purposes under a Personal Learning Environment approach (PLE 2.0). The success of these attempts depends upon the reactions and acceptance of users towards e-learning…

  18. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard root analysis

    NASA Astrophysics Data System (ADS)

    Grzeszczuk, A.; Kowalski, S.

    2015-04-01

    Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to accelerate graphics by computing processes in parallel. The success of this solution opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volume of data produced by pulse-shape analysis measurements, either by on-line recalculation or by a very fast compression scheme. The simplified structure of the CUDA system and its programming model, based on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.

  19. Chemical vapor deposition fluid flow simulation modelling tool

    NASA Technical Reports Server (NTRS)

    Bullister, Edward T.

    1992-01-01

    Accurate numerical simulation of chemical vapor deposition (CVD) processes requires a general purpose computational fluid dynamics package combined with specialized capabilities for high temperature chemistry. In this report, we describe the implementation of these specialized capabilities in the spectral element code NEKTON. The thermal expansion of the gases involved is shown to be accurately approximated by the low Mach number perturbation expansion of the incompressible Navier-Stokes equations. The radiative heat transfer between multiple interacting radiating surfaces is shown to be tractable using the method of Gebhart. The disparate rates of reaction and diffusion in CVD processes are calculated via a point-implicit time integration scheme. We demonstrate the use above capabilities on prototypical CVD applications.

  20. Design and Verification Guidelines for Vibroacoustic and Transient Environments

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others - both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC, for the application of the vibroacoustic and transient technology to all launch vehicle and payload components and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by the MSFC program management and their contractors as a guide for the design and verification of flight hardware.

  1. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    PubMed

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro ; source code is available at http://tinyurl.com/oasisproSourceCode .
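
    The evaluation protocol the abstract names (score prediction performance on held-out test sets) amounts to reserving part of the data before any model fitting. A hypothetical sketch, not OASISPRO's actual code; the records, seed, and 80/20 split fraction are all invented for illustration:

    ```python
    import random

    # Hypothetical (omics_profile, phenotype) records; illustrative only.
    records = [([float(i), float(i * i % 7)], i % 2) for i in range(20)]

    # Shuffle once with a fixed seed, then hold out 20% as a test set that the
    # model never sees during training; metrics reported on it estimate
    # generalization rather than memorization.
    rng = random.Random(42)
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(0.8 * len(shuffled))
    train_set, test_set = shuffled[:cut], shuffled[cut:]
    ```

    Any classifier is then fit on `train_set` only, and accuracy or survival-prediction metrics are computed on `test_set`.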

  2. Your company's history as a leadership tool.

    PubMed

    Seaman, John T; Smith, George David

    2012-12-01

    When the history of an organization comes up, it's usually in connection with an anniversary--just part of the "balloons and fireworks" (as one business leader characterized his company's bicentennial celebration, knowing that the investment of time and money would have little staying power). A fast-changing world leaves little time for nostalgia and irrelevant details--or, worse, strategies for winning the last war. But the authors, business historians at the Winthrop Group, assert that leaders with no patience for history are missing a vital truth: A sophisticated understanding of the past is one of the most powerful tools they have for shaping the future. The job of leaders, most would agree, is to inspire collective efforts and devise smart strategies for the future. History can be profitably employed on both fronts. As a leader strives to get people working together productively, communicating the history of the enterprise can instill a sense of identity and purpose and suggest the goals that will resonate. In its most familiar form, as a narrative about the past, history is a rich explanatory tool with which executives can make a case for change and motivate people to overcome challenges. Taken to a higher level, it also serves as a potent problem-solving tool, one that offers pragmatic insights, valid generalizations, and meaningful perspectives--a way to cut through management fads and the noise of the moment to what really matters.

  3. Knowledge and attitude about computer and internet usage among dental students in Western Rajasthan, India.

    PubMed

    Jali, Pramod K; Singh, Shamsher; Babaji, Prashant; Chaurasia, Vishwajit Rampratap; Somasundaram, P; Lau, Himani

    2014-01-01

    The internet is a useful tool for updating knowledge. The aim of the present study was to assess the current level of knowledge of computers and the internet among undergraduate dental students. The study consisted of a self-administered, close-ended questionnaire survey distributed to undergraduate dental students and conducted from July to September 2012. In the selected sample, the response rate was 100%. Most (94.4%) of the students had computer knowledge, and 77.4% had their own computer with access at home. About 40.8% of students used the computer for general purposes, 28.5% for entertainment, and 22.8% for research. Most of the students had internet knowledge (92.9%) and used it independently (79.1%). About 42.1% used the internet occasionally, 34.4% regularly, 21.7% rarely, and 1.8% not at all. The internet was preferred for getting information (48.8%) owing to easy accessibility and recent updates. For dental purposes, students used the internet 2-3 times/week (45.3%). Most (95.3%) of the students favored having a computer-based learning program in the curriculum. Overall, computer knowledge was observed to be good among dental students.

  4. A content analysis of chronic diseases social groups on Facebook and Twitter.

    PubMed

    De la Torre-Díez, Isabel; Díaz-Pernas, Francisco Javier; Antón-Rodríguez, Míriam

    2012-01-01

    Research on the use of social networks for health-related purposes is limited. This study aims to characterize the purpose and use of Facebook and Twitter groups concerning colorectal cancer, breast cancer, and diabetes. We searched Facebook ( www.facebook.com ) and Twitter ( www.twitter.com ) using the terms "colorectal cancer," "breast cancer," and "diabetes." Each important group was analyzed by extracting its network name, number of members, interests, and Web site URL. We found 216 breast cancer groups, 171 colorectal cancer groups, and 527 diabetes groups on Facebook and Twitter. The largest percentage of the colorectal cancer groups (25.58%) addresses prevention, similarly to breast cancer, whereas diabetes groups are mainly focused on research issues (25.09%). There are more social groups about breast cancer and diabetes on Facebook (around 82%) than on Twitter (around 18%). For colorectal cancer, the difference is smaller: Facebook had 62.23%, and Twitter 31.76%. Social networks are a useful tool for supporting patients suffering from these three diseases. Regarding the use of these social networks for disease support purposes, Facebook shows a higher usage rate than Twitter, perhaps because Twitter is newer than Facebook and its use is not yet as widespread.

  5. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  6. Falls risk assessment begins with hello: lessons learned from the use of one home health agency's fall risk tool.

    PubMed

    Flemming, Patricia J; Ramsay, Katherine

    2012-10-01

    Identifying older adults at risk for falls is a challenge all home healthcare agencies (HHAs) face. The process of assessing for falls risk begins with the initial home visit. One HHA affiliated with an academic medical center describes its experience in development and use of a Falls Risk Assessment (FRA) tool over a 10-year period. The FRA tool has been modified since initial development to clarify elements of the tool based on research and to reflect changes in the Outcome and Assessment Information Set (OASIS) document. The primary purpose of this article is to share a validated falls risk assessment tool to facilitate identification of fall-related risk factors in the homebound population. A secondary purpose is to share lessons learned by the HHA during the 10 years using the FRA.

  7. Application of a Computerized General Purpose Information Management System (SELGEM) to Medically Important Arthropods (Diptera: Culcidae).

    DTIC Science & Technology

    1980-06-01

    Application of a Computerized General Purpose Information Management System (SELGEM) to Medically Important Arthropods (Diptera: Culicidae). Annual Report, 1 September 1979-30 May 1980, by Terry L. Erwin, June 1980.

  8. Communicative Tools and Modes in Thematic Preschool Work

    ERIC Educational Resources Information Center

    Ahlskog-Björkman, Eva; Björklund, Camilla

    2016-01-01

    This study focuses on teachers' ways of mediating meaning through communicative tools and modes in preschool thematic work. A socio-cultural perspective is used for analysis on how tools and modes are provided for children to make use of for communicative purposes. The research questions are: (1) what communicative tools do teachers use in their…

  9. Tool use and mechanical problem solving in apraxia.

    PubMed

    Goldenberg, G; Hagmann, S

    1998-07-01

    Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose for which they were made, but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: it can be based on retrieval of instructions of use from semantic memory, or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instructions of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of selection and application of novel tools. Access to instructions of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty-two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD), and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use, but not with the selection, of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection, but there were individual cases who scored in the defective range on one of these tests and normally on the other. Analysis of LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on both pantomime of object use and novel tool selection committed errors in actual use of familiar tools.
The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequel of loss of access to instruction of use.

  10. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  11. Purpose-Driven Leadership: Defining, Defending and Sustaining a School's Purpose

    ERIC Educational Resources Information Center

    Holloman, Harold L., Jr.; Rouse, William A., Jr.; Farrington, Vernon

    2007-01-01

    Purpose-driven leadership is a constructive leadership model that challenges an organization to: define its purpose, maintain integrity, encourage character, prevent burnout and sustain vitality. The model incorporates "best practice language" and the tools needed to foster a meaningful discourse. As school leaders strive to define, defend, and…

  12. Psychological morbidity, quality of life, and self-rated health in the military personnel

    PubMed Central

    Chou, Han-Wei; Tzeng, Wen-Chii; Chou, Yu-Ching; Yeh, Hui-Wen; Chang, Hsin-An; Kao, Yu-Cheng; Tzeng, Nian-Sheng

    2014-01-01

    Objective The mental health of military personnel varies as a result of different cultural, political, and administrative factors. The purpose of this study was to evaluate the psychological morbidity and quality of life of military personnel in Taiwan. Materials and methods This cross-sectional study utilized the World Health Organization Quality of Life Instrument, brief version, Taiwan version, the General Health Questionnaire-12, Chinese version, and the Visual Analog Scale (VAS) in several military units. Results More than half of the subjects (55.3%) identified themselves as mentally unhealthy on the General Health Questionnaire-12, Chinese version; however, a higher percentage of officers perceived themselves as healthy (57.4%) than did noncommissioned officers (38.5%) or enlisted men (42.2%). Officers also had higher total quality of life (QOL) scores (83.98) than did enlisted men (79.67). Scores on the VAS also varied: officers: 72.5; noncommissioned officers: 67.7; and enlisted men: 66.3. The VAS and QOL were positively correlated with perceived mental health among these military personnel. Conclusion Our subjects had higher rates of perceiving themselves as mentally unhealthy compared to the general population. Those of higher rank perceived themselves as having better mental health and QOL. Improving mental health could result in a better QOL in the military. The VAS may be a useful tool for the rapid screening of self-reported mental health, which may be suitable in cases of stressful missions, such as in disaster rescue; however, more studies are needed to determine the optimal cut-off point of this measurement tool. PMID:24570587

  13. Concordant parent-child reports of anxiety predict impairment in youth with functional abdominal pain

    PubMed Central

    Cunningham, Natoshia Raishevich; Cohen, Mitchell B.; Farrell, Michael K.; Mezoff, Adam G.; Lynch-Jordan, Anne; Kashikar-Zuck, Susmita

    2014-01-01

    Introduction Functional abdominal pain (FAP) is associated with significant anxiety and impairment. Prior investigations of child anxiety in youth with FAP are generally limited by small sample sizes, rely on child report alone, and use lengthy diagnostic tools. It is unknown 1) whether a brief anxiety screening tool is feasible, 2) whether parent and child reports of anxiety are congruent, and 3) whether parent-child agreement on child anxiety corresponds to increased impairment. The purpose of this investigation was to examine anxiety characteristics in youth with FAP using parent and child reports. Parent-child agreement on child anxiety symptoms was examined in relation to pain and disability. Materials and Methods One hundred patients with FAP (8-18 years of age) recruited from pediatric gastroenterology clinics completed measures of pain intensity (Numeric Rating Scale) and disability (Functional Disability Inventory). Patients and caregivers both completed a measure of child anxiety characteristics (Screen for Child Anxiety and Related Disorders). Results Clinically significant anxiety symptoms were more commonly reported by youth (54%) than by their parents (30%). Panic/somatic symptoms, generalized anxiety, and separation anxiety were most commonly endorsed by patients, whereas generalized anxiety, separation anxiety, and school avoidance were most commonly reported by parents. The majority (65%) of parents and children agreed on the presence (26%) or absence (39%) of clinically significant anxiety. Parent-child agreement on clinically significant anxiety was related to increased impairment. Discussion A brief screening instrument capturing parent and child reports of anxiety can provide clinically relevant information for comprehensive treatment planning in children with FAP. PMID:25714575

  14. Using the case study teaching method to promote college students' critical thinking skills

    NASA Astrophysics Data System (ADS)

    Terry, David Richard

    2007-12-01

    The purpose of this study was to examine general and domain-specific critical thinking skills in college students, particularly ways in which these skills might be increased through the use of the case study method of teaching. General critical thinking skills were measured using the Watson-Glaser Critical Thinking Appraisal (WGCTA) Short Form, a forty-item paper-and-pencil test designed to measure important abilities involved in critical thinking, including inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments. The ability to identify claims and support those claims with evidence is also an important aspect of critical thinking. I developed a new instrument, the Claim and Evidence Assessment Tool (CEAT), to measure these skills in a domain-specific manner. Forty undergraduate students in a general science course for non-science majors at a small two-year college in the northeastern United States experienced positive changes in general critical thinking according to results obtained using the Watson-Glaser Critical Thinking Appraisal (WGCTA). In addition, the students showed cumulative improvement in their ability to identify claims and evidence, as measured by the Claim and Evidence Assessment Tool (CEAT). Mean score on the WGCTA improved from 22.15 +/- 4.59 to 23.48 +/- 4.24 (out of 40), and the mean CEAT score increased from 14.98 +/- 3.28 to 16.20 +/- 3.08 (out of 24). These increases were modest but statistically and educationally significant. No differences in claim and evidence identification were found between students who learned about specific biology topics using the case study method of instruction and those who were engaged in more traditional instruction, and the students' ability to identify claims and evidence and their factual knowledge showed little if any correlation. 
The results of this research were inconclusive regarding whether or not the case study teaching method promotes college students' general or domain-specific critical thinking skills, and future research addressing this issue should probably utilize larger sample sizes and a pretest-posttest randomized experimental design.
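
    The claim that the pretest-to-posttest gains are statistically significant rests on a paired comparison of each student's two scores. A minimal sketch of the paired t statistic, stdlib-only; the scores below are invented for illustration and are not the study's data:

    ```python
    from math import sqrt
    from statistics import mean, stdev

    # Hypothetical pre/post test scores for a few students (illustrative only).
    pre  = [20, 22, 19, 25, 23, 21, 24, 22]
    post = [22, 23, 21, 26, 24, 23, 25, 22]

    # Paired t statistic on per-student differences: t = mean(d) / (s_d / sqrt(n)),
    # where s_d is the sample standard deviation of the differences.
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))
    ```

    The resulting t value is then compared against the t distribution with n - 1 degrees of freedom; pairing each student with themselves removes between-student variability, which is why modest mean gains like those reported can still reach significance.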

  15. Decision-making in general practice: the effect of financial incentives on the use of laboratory analyses.

    PubMed

    Munkerud, Siri Fauli

    2012-04-01

    This paper examines the reaction of general practitioners (GPs) to a reform in 2004 in the remuneration system for using laboratory services in general practice. The purpose of this paper is to study whether income motivation exists regarding the use of laboratory services in general practice, and if so, the degree of income motivation among general practitioners (GPs) in Norway. We argue that the degree of income motivation is stronger when the physicians are uncertain about the utility of the laboratory service in question. We have panel data from actual physician-patient encounters in general practices in the years 2001-2004 and use discrete choice analysis and random effects models. Estimation results show that an increase in the fees will lead to a small but significant increase in use. The reform led to minor changes in the use of laboratory analyses in GPs' offices, and we argue that financial incentives were diluted because they were in conflict with medical recommendations and existing medical practice. The patient's age has the most influence and the results support the hypothesis that the impact of income increases with increasing uncertainty about diagnosis and treatment. The policy implication of our results is that financial incentives alone are not an effective tool for influencing the use of laboratory services in GPs' offices.

  16. The Efficiency of the "Learning Management System (LMS)" in AOU, Kuwait, as a Communication Tool in an E-Learning System

    ERIC Educational Resources Information Center

    Alfadly, Ahmad Assaf

    2013-01-01

    Purpose: The integration of a Learning Management System (LMS) at the Arab Open University (AOU), Kuwait, opens new possibilities for online interaction between teachers and students. The purpose of this paper is to evaluate the efficiency of the LMS at AOU, Kuwait as a communication tool in the E-learning system and to find the best automated…

  17. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  18. 41 CFR 109-38.5103 - Motor vehicle utilization standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... are established for DOE as objectives for those motor vehicles operated generally for those purposes for which acquired: (1) Sedans and station wagons, general purpose use—12,000 miles per year. (2) Light trucks (4×2's) and general purpose vehicles, one ton and under (less than 12,500 GVWR)—10,000...

  19. 41 CFR 109-38.5103 - Motor vehicle utilization standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... are established for DOE as objectives for those motor vehicles operated generally for those purposes for which acquired: (1) Sedans and station wagons, general purpose use—12,000 miles per year. (2) Light trucks (4×2's) and general purpose vehicles, one ton and under (less than 12,500 GVWR)—10,000...

  20. 76 FR 76954 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-09

    ... Bombs, 1000 BLU-117 2000lb General Purpose Bombs, 600 BLU-109 2000lb Hard Target Penetrator Bombs, and four BDU-50C inert bombs, fuzes, weapons integration, munitions trainers, personnel training and... kits, 3300 BLU-111 500lb General Purpose Bombs, 1000 BLU-117 2000lb General Purpose Bombs, 600 BLU-109...

  1. Application of a Computerized General Purpose Information Management System (SELGEM) (SELf-GEnerating Master) to Medically Important Arthropods (Diptera: Culicidae).

    DTIC Science & Technology

    1982-07-01

    Application of a Computerized General Purpose Information Management System (SELGEM) to Medically Important Arthropods (Diptera: Culicidae). Annual Report, July 1981 to June 1982. Terry L. Erwin.

  2. Parallelization and checkpointing of GPU applications through program transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solano-Quinde, Lizandro Damian

    2012-01-01

    GPUs have emerged as a powerful tool for accelerating general-purpose applications. The availability of programming languages that make writing general-purpose applications for GPUs tractable has consolidated GPUs as an alternative for accelerating general-purpose applications. Among the areas that have benefited from GPU acceleration are signal and image processing, computational fluid dynamics, quantum chemistry, and, in general, the High Performance Computing (HPC) industry. In order to continue to exploit higher levels of parallelism with GPUs, multi-GPU systems are gaining popularity. In this context, single-GPU applications are parallelized for running on multi-GPU systems. Furthermore, multi-GPU systems help to overcome the GPU memory limitation for applications with a large memory footprint. Parallelizing single-GPU applications has been approached with libraries that distribute the workload at runtime; however, these impose execution overhead and are not portable. On traditional CPU systems, by contrast, parallelization has been approached through application transformation at pre-compile time, which enhances the application to distribute the workload at the application level and avoids the issues of library-based approaches. Hence, a parallelization scheme for GPU systems based on application transformation is needed. As with any computing engine of today, reliability is also a concern in GPUs, which are vulnerable to transient and permanent failures. Current checkpoint/restart techniques are not suitable for systems with GPUs. Checkpointing for GPU systems presents new and interesting challenges, primarily due to the natural differences imposed by the hardware design, the memory subsystem architecture, the massive number of threads, and the limited amount of synchronization among threads. Therefore, a checkpoint/restart technique suitable for GPU systems is needed.
The goal of this work is to exploit higher levels of parallelism and to develop support for application-level fault tolerance in applications using multiple GPUs. Our techniques reduce the burden of enhancing single-GPU applications to support these features. To achieve our goal, this work designs and implements a framework for enhancing a single-GPU OpenCL application through application transformation.
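    The application-level workload distribution described above amounts to partitioning a kernel's index space across devices. The sketch below is an invented illustration of such a block partition, not code from this dissertation:

```python
def partition(n_items, n_devices):
    """Even block partition of a 1-D kernel index space across devices:
    returns (offset, size) pairs, one per device, covering all items."""
    base, rem = divmod(n_items, n_devices)
    chunks, start = [], 0
    for d in range(n_devices):
        size = base + (1 if d < rem else 0)  # spread the remainder
        chunks.append((start, size))
        start += size
    return chunks

print(partition(10, 3))  # → [(0, 4), (4, 3), (7, 3)]
```

    Each device then runs the original kernel over its own (offset, size) slice, which is exactly the kind of code a pre-compile-time transformation can emit once instead of deciding at runtime.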

  3. Advances in lenticular lens arrays for visual display

    NASA Astrophysics Data System (ADS)

    Johnson, R. Barry; Jacobsen, Gary A.

    2005-08-01

    Lenticular lens arrays are widely used in the printed display industry and in specialized applications of electronic displays. In general, lenticular arrays can create from interlaced printed images such visual effects as 3-D, animation, flips, morph, zoom, or various combinations. The use of these typically cylindrical lens arrays for this purpose began in the late 1920s. The lenses comprise a front surface having a spherical cross-section and a flat rear surface at which the material to be displayed is proximately located. The principal limitation on the image quality of current lenticular lenses is spherical aberration. This limitation causes lenticular lens arrays to be generally thick (0.5 mm) and not easily wrapped around such items as cans or bottles. The objectives of this research effort were to develop a realistic analytical model, to significantly improve the image quality, to develop the tooling necessary to fabricate lenticular lens array extrusion cylinders, and to develop enhanced fabrication technology for the extrusion cylinder. It was determined that the most viable cross-sectional shape for the lenticular lenses is elliptical. This shape dramatically improves the image quality. The relationship between the lens radius, conic constant, material refractive index, and thickness will be discussed. A significant challenge was to fabricate a diamond-cutting tool having the proper elliptical shape. Both true elliptical and pseudo-elliptical diamond tools were designed and fabricated. The extruded plastic sheets can be quite thin (< 0.25 mm) and, consequently, can be wrapped around cans and the like. Fabrication of the lenticular engraved extrusion cylinder required remarkable development, considering the large physical size and weight of the cylinder and the tight mechanical tolerances associated with the lenticular lens molds cut into the cylinder's surface.
The development of the cutting tool and the lenticular engraved extrusion cylinder will be presented in addition to an illustrative comparison of current lenticular technology and the new technology. Three U.S. patents have been issued as a consequence of this research effort.
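    The choice of an elliptical cross-section is consistent with a standard aspheric-design result, which we add here for context (the paper's own radius/conic/index/thickness relationship is not reproduced): a single refracting surface that stigmatically focuses a collimated beam into a medium of refractive index n is a prolate ellipse of eccentricity 1/n, giving conic constant k = -1/n².

```python
def elliptical_conic_constant(n):
    """Conic constant of the stigmatic refracting surface for a
    collimated beam entering a medium of refractive index n: a prolate
    ellipse with eccentricity e = 1/n, so k = -e**2 = -1/n**2."""
    return -1.0 / n ** 2

# n = 1.57 is an assumed, typical value for an extruded lenticular
# plastic; the paper does not state the material index here.
print(round(elliptical_conic_constant(1.57), 3))  # → -0.406
```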

  4. A multiphysics and multiscale software environment for modeling astrophysical systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    2009-05-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
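    The single-interface idea behind MUSE can be sketched as follows; all class and function names are invented for illustration and are not MUSE's actual API:

```python
class Module:
    """The narrow interface every domain solver is adapted to."""
    def evolve(self, t_end):
        raise NotImplementedError

class StellarDynamics(Module):
    def __init__(self):
        self.t = 0.0
    def evolve(self, t_end):
        self.t = t_end  # stand-in for an N-body integration step

class StellarEvolution(Module):
    def __init__(self):
        self.t = 0.0
    def evolve(self, t_end):
        self.t = t_end  # stand-in for a stellar-evolution step

def couple(modules, t_end, dt):
    """Advance all modules over a shared timeline; in a real framework,
    state would be exchanged at each synchronization point."""
    t = 0.0
    while t < t_end:
        t = min(t + dt, t_end)
        for m in modules:
            m.evolve(t)
    return t
```

    Because every solver implements the same `evolve` contract, the coupling loop never needs to know which domain (or, behind a language binding, which language) a module comes from.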

  5. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a ``Noah's Ark'' milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  6. Effects of soil water content on the external exposure of fauna to radioactive isotopes.

    PubMed

    Beaugelin-Seiller, K

    2016-01-01

    Within a recent model intercomparison on radiological risk assessment for contaminated wetlands, the influence of soil saturation conditions on external dose rates was evidenced. This issue joined the concerns of assessors regarding the choice of the soil moisture value to input into radiological assessment tools such as the ERICA Tool. Does it really influence the assessment results, and how? This question was investigated under IAEA's Modelling and Data for Radiological Impacts Assessments (MODARIA) programme via 42 scenarios in which the soil water content varied from 0 (dry soil) to 100% (saturated soil), in combination with other parameters that may influence the values of the external dose conversion coefficients (DCCs) calculated for terrestrial organisms exposed in soil. A set of α, β, and γ emitters was selected in order to cover the range of possible emission energies. The values of their external DCCs generally varied within a factor of 1 to 1.5 with the soil water content, except for β emitters, which appeared more sensitive (DCCs varied within a factor of about 3). This may be of importance for some specific cases or for upper tiers of radiological assessments, when refinement is required. But for the general purpose of screening assessment of radiological impact on fauna and flora, current approaches regarding the soil water content are adequate. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
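    The weight-windows technique named above combines splitting and Russian roulette. The sketch below is a generic textbook rendering of that mechanism, not McENL's implementation:

```python
import random

def apply_weight_window(weight, w_low, w_high, w_survive):
    """One weight-window check for a Monte Carlo particle.

    Above the window: split into lighter copies (weight conserved
    exactly). Below the window: Russian roulette (weight conserved
    in expectation). Inside the window: leave the particle alone.
    Returns the list of surviving particle weights.
    """
    if weight > w_high:
        n = int(weight / w_survive + 0.5)  # number of split copies
        return [weight / n] * n
    if weight < w_low:
        if random.random() < weight / w_survive:
            return [w_survive]  # survivor boosted to survival weight
        return []               # particle killed
    return [weight]
```

    Splitting conserves total weight exactly; roulette conserves it only in expectation, which is what keeps the estimator unbiased while the population of low-weight particles is thinned out.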

  8. Transit boardings estimation and simulation tool (TBEST) calibration for guideway and BRT modes : [summary].

    DOT National Transportation Integrated Search

    2013-06-01

    As demand for public transportation grows, planning tools used by Florida Department of Transportation (FDOT) and other transit agencies must evolve to effectively predict changing patterns of ridership. A key tool for this purpose is the Transit Boa...

  9. Groundwater quality assessment using geospatial and statistical tools in Salem District, Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Arulbalaji, P.; Gurugnanam, B.

    2017-10-01

    The water quality study of Salem district, Tamil Nadu was carried out to assess water suitability for domestic and irrigation purposes. For this purpose, 59 groundwater samples were collected and analyzed for pH, electrical conductivity (EC), total dissolved solids (TDS), major anions (HCO3-, CO3-, F-, Cl-, NO2- + NO3-, and SO42-), major cations (Ca2+, Mg2+, Na+, and K+), alkalinity (ALK), and hardness (HAR). From the analytical results, the following were derived: a Piper plot, the water quality index (WQI), sodium adsorption ratio (SAR), magnesium hazard (MH), Kelly index (KI), and residual sodium carbonate (RSC). The Wilcox diagram shows that 23% of the samples are excellent to good, 40% good to permissible, 10% permissible to doubtful, 24% doubtful to unsuitable, and only 3% unsuitable for irrigation. SAR values show that 52% of the samples indicate high-to-very-high and low-to-medium alkali water. KI values indicate that 30% of the samples are of good quality and 70% are not suitable for irrigation purposes. RSC values indicate that 89% of the samples are suitable for irrigation purposes. MH values reveal that 17% of the samples are suitable and 83% are not suitable for irrigation purposes; for domestic purposes, the water is excellent (8%), good (48%), or poor (44%). Agricultural waste, fertilizer use, soil leaching, urban runoff, livestock waste, and sewage are the sources of poor water quality. Some samples are not suitable for irrigation purposes due to high salinity, hardness, and magnesium concentration. In general, the groundwater of the Salem district was polluted by agricultural activities, anthropogenic activities, ion exchange, and weathering.
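    The irrigation indices used here have standard definitions in terms of ion concentrations in meq/L; the functions below restate those textbook formulas with invented sample values (not data from this study):

```python
import math

def sar(na, ca, mg):
    """Sodium adsorption ratio; all ion concentrations in meq/L."""
    return na / math.sqrt((ca + mg) / 2)

def magnesium_hazard(ca, mg):
    """Magnesium hazard (%) = Mg / (Ca + Mg) * 100, in meq/L."""
    return mg / (ca + mg) * 100

def kelly_index(na, ca, mg):
    """Kelly index = Na / (Ca + Mg), in meq/L."""
    return na / (ca + mg)

def rsc(hco3, co3, ca, mg):
    """Residual sodium carbonate = (HCO3 + CO3) - (Ca + Mg), in meq/L."""
    return (hco3 + co3) - (ca + mg)

# Invented sample values in meq/L, for illustration only:
na, ca, mg, hco3, co3 = 4.0, 3.0, 2.0, 4.5, 0.2
print(round(sar(na, ca, mg), 2),           # → 2.53
      round(magnesium_hazard(ca, mg), 1),  # → 40.0
      round(kelly_index(na, ca, mg), 2),   # → 0.8
      round(rsc(hco3, co3, ca, mg), 2))    # → -0.3
```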

  10. Development and psychometric evaluation of the "Neurosurgical Evaluation of Attitudes towards simulation Training" (NEAT) tool for use in neurosurgical education and training.

    PubMed

    Kirkman, Matthew A; Muirhead, William; Nandi, Dipankar; Sevdalis, Nick

    2014-01-01

    Neurosurgical simulation training is becoming increasingly popular. Attitudes toward simulation among residents can contribute to the effectiveness of simulation training, but such attitudes remain poorly explored in neurosurgery, with no psychometrically proven measure in the literature. The aim of the present study was to prospectively evaluate a newly developed tool for this purpose: the Neurosurgical Evaluation of Attitudes towards simulation Training (NEAT). The NEAT tool was prospectively developed in 2 stages and psychometrically evaluated (validity and reliability) in 2 administrations with the same participants. The tool comprises a questionnaire with 9 Likert-scale items and 2 free-text sections assessing attitudes toward simulation in neurosurgery. The evaluation was completed with 31 neurosurgery residents in London, United Kingdom, who were generally favorable toward neurosurgical simulation. The internal consistency of the questionnaire was high, as demonstrated by the overall Cronbach α values (α=0.899 and α=0.955). All but 2 questionnaire items had "substantial" or "almost perfect" test-retest reliability across the repeated administrations (median Pearson r correlation=0.688; range, 0.248-0.841). NEAT items were well correlated with each other on both occasions, indicating good content validity of the NEAT tool. There was no significant relationship between either gender or length of neurosurgical experience and item ratings. NEAT is the first psychometrically evaluated tool for assessing attitudes toward simulation in neurosurgery. Further implementation of NEAT is required in wider neurosurgical populations to establish whether specific population groups differ.
Use of NEAT in studies of neurosurgical simulation could offer an additional outcome measure to performance metrics, permitting evaluation of the impact of neurosurgical simulation on attitudes toward simulation both between participants and within the same participants over time. Copyright © 2014 Elsevier Inc. All rights reserved.
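    The Cronbach α reported above can be computed from raw item scores as follows; this is a generic sketch of the statistic, not the study's analysis script:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from raw scores.

    items: list of k item-score lists, each over the same respondents.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))
```

    Perfectly consistent items give α = 1; values like the 0.899 and 0.955 reported here indicate high internal consistency.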

  11. Just tooling around: Experiences with arctools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuttle, M.A.

    1994-06-01

    The three US Department of Energy (DOE) installations on the Oak Ridge Reservation (Oak Ridge National Laboratory, Y-12, and K-25) were established during World War II as part of the Manhattan Project to build "the bomb." In later years, the work at these facilities involved nuclear energy research, defense-related activities, and uranium enrichment, resulting in the generation of radioactive material and other toxic by-products. Work is now in progress to identify and clean up the environmental contamination from these and other wastes. Martin Marietta Energy Systems, Inc., which manages the Oak Ridge sites as well as DOE installations at Portsmouth, Ohio and Paducah, Kentucky, has been charged with creating and maintaining a comprehensive environmental information system in order to comply with the Federal Facility Agreement (FFA) for the Oak Ridge Reservation and the State of Tennessee Oversight Agreement between the US Department of Energy and the State of Tennessee. As a result, the Oak Ridge Environmental Information System (OREIS) was conceived and is currently being implemented. The tools chosen for the OREIS system are Oracle for the relational database, SAS for data analysis and graphical representation, and Arc/INFO and ArcView for the spatial analysis and display component. Within the broad scope of ESRI's Arc/INFO software, ArcTools was chosen as the graphic user interface for inclusion of Arc/INFO into OREIS. The purpose of this paper is to examine the advantages and disadvantages of incorporating ArcTools for the presentation of Arc/INFO in the OREIS system. The immediate and mid-term development goals of the OREIS system as they relate to ArcTools are presented, along with a general discussion of our experiences with the ArcTools product.

  12. Research Costs Investigated: A Study Into the Budgets of Dutch Publicly Funded Drug-Related Research.

    PubMed

    van Asselt, Thea; Ramaekers, Bram; Corro Ramos, Isaac; Joore, Manuela; Al, Maiwenn; Lesman-Leegte, Ivonne; Postma, Maarten; Vemer, Pepijn; Feenstra, Talitha

    2018-01-01

    The costs of performing research are an important input in value of information (VOI) analyses but are difficult to assess. The aim of this study was to investigate the costs of research, serving two purposes: (1) estimating research costs for use in VOI analyses; and (2) developing a costing tool to support reviewers of grant proposals in assessing whether a proposed budget is realistic. For granted study proposals from the Netherlands Organization for Health Research and Development (ZonMw), the type of study, potential cost drivers, proposed budget, and general characteristics were extracted. Regression analysis was conducted in an attempt to generate a 'predicted budget' for certain combinations of cost drivers, for implementation in the costing tool. Of 133 drug-related research grant proposals, 74 were included for complete data extraction. Because an association between cost drivers and budgets could not be confirmed, we could not generate a predicted budget based on regression analysis, only historic reference budgets for given study characteristics. The costing tool was designed accordingly: given a set of selection criteria, the tool returns the range of budgets of comparable studies. This range can be used in VOI analysis to estimate whether the expected net benefit of sampling, and hence the net value of future research, will be positive. The absence of an association between study characteristics and budgets may indicate inconsistencies in the budgeting or granting process. Nonetheless, the tool generates useful information on historical budgets and offers the option to formally relate VOI to budgets. To our knowledge, this is the first attempt at creating such a tool, which can be complemented with newly granted studies, enlarging the underlying database and keeping the estimates up to date.

  13. Tools to overcome potential barriers to chlamydia screening in general practice: Qualitative evaluation of the implementation of a complex intervention.

    PubMed

    Ricketts, Ellie J; Francischetto, Elaine O'Connell; Wallace, Louise M; Hogan, Angela; McNulty, Cliodna A M

    2016-03-22

    Chlamydia trachomatis remains a significant public health problem. We used a complex intervention with general practice staff, consisting of practice-based workshops, posters, computer prompts, and testing-rate feedback, to increase routine chlamydia screening tests in under-25-year-olds in South West England. We aimed to evaluate how the intervention components were received by staff and to understand what determined their implementation into ongoing practice. We used face-to-face and telephone individual interviews with 29 general practice staff, analysed thematically within a Normalisation Process Theory framework which explores: 1. Coherence (whether participants understand the purpose of the intervention); 2. Cognitive participation (engagement with and implementation of the intervention); 3. Collective action (work actually undertaken that drives the intervention forwards); 4. Reflexive monitoring (assessment of the impact of the intervention). Our results demonstrated coherence, as all staff, including receptionists, understood that the purpose of the training was to make them aware of the value of chlamydia screening tests and of how to increase testing in their general practice. The training was described by nearly all staff as being of high quality and as responsible for creating a shared understanding between staff of how to undertake routine chlamydia screening. Cognitive participation in many general practice staff teams was demonstrated through their engagement by meeting after the training to discuss implementation, which confirmed the role of each staff member and the use of the materials. However, several participants still felt unable to discuss chlamydia in many consultations or described sexual health as a low priority among colleagues. National targets were considered so high by some general practice staff that they did not engage with the screening intervention.
Collective action work undertaken to drive the intervention included the use of computer prompts, which helped staff remember to make the offer, testing-rate feedback, and having a designated lead. Ensuring that patients collected samples while still in the general practice was not achieved in most general practices. Reflexive monitoring showed positive feedback from patients and other staff about the value of screening, and feedback about the general practices' testing rates helped sustain activity. A complex intervention including interactive workshops, materials to help implementation, and feedback can help increase chlamydia screening testing in general practices.

  14. 77 FR 9899 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ... Medium Range Air-to-Air Missiles, 42 GBU-49 Enhanced PAVEWAY II 500 lb Bombs, 200 GBU-54 (2000 lb) Laser Joint Direct Attack Munitions (JDAM) Bombs, 642 BLU-111 (500 lb) General Purpose Bombs, 127 MK-82 (500 lb) General Purpose Bombs, 80 BLU-117 (2000 lb) General Purpose Bombs, 4 MK-84 (2000 lb) Inert...

  15. 78 FR 39947 - To Modify Duty-Free Treatment Under the Generalized System of Preferences and for Other Purposes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... Treatment Under the Generalized System of Preferences and for Other Purposes Presidential... Modify Duty-Free Treatment Under the Generalized System of Preferences and for Other Purposes By the... competitive need limitations on the preferential treatment afforded under the GSP to eligible articles. 4...

  16. Impact of the US Patent System on the promise of personalized medicine.

    PubMed

    Solomon, Louis M; Sieczkiewicz, Gregory J

    2007-09-01

    The biotechnology revolution promises unfathomable future scientific discovery. One of the potential benefits is the accelerated introduction of new diagnostics and treatments to the general public. The right medication for the right patient is the goal of personalized medicine, which directly benefits from many of biotechnology's biggest and most recent advances. The US patent system rewards innovation in medicine and other arts and sciences by granting innovators, for a period of time, the right to exclude others from using what was invented. One of the purposes of the patent system is to trade that right to exclude, and in its stead obtain the patent holder's obligation to fully and publicly disclose the essence of the innovations so that they can be improved, thus advancing the common welfare. A tension exists between personalized medicine's need for access to and use of scientific advances and the patent system's reward of exclusive use or nonuse to innovators. This tension may result in fewer diagnostic and therapeutic tools brought to the market and generally adopted. The risk seems particularly acute with respect to the diagnostic and therapeutic tools arising from genetic testing that hold specific value for a subset of the population. The judicial system has introduced ethical exceptions that overcome a patent holder's right to exclude; these judicial overrides relate to the provision of certain types of medical procedures and the development of certain types of new drugs, and not, apparently, to the use of diagnostic and therapeutic tools essential to the success of personalized medicine. A serious question exists as to whether legislative action is necessary to increase public access to genetic testing.

  17. 23 CFR 1.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Purpose. 1.1 Section 1.1 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION GENERAL MANAGEMENT AND ADMINISTRATION GENERAL § 1.1 Purpose. The purpose of the regulations in this part is to implement and carry out the provisions of Federal law...

  18. 23 CFR 1.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Purpose. 1.1 Section 1.1 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION GENERAL MANAGEMENT AND ADMINISTRATION GENERAL § 1.1 Purpose. The purpose of the regulations in this part is to implement and carry out the provisions of Federal law...

  19. 23 CFR 1.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Purpose. 1.1 Section 1.1 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION GENERAL MANAGEMENT AND ADMINISTRATION GENERAL § 1.1 Purpose. The purpose of the regulations in this part is to implement and carry out the provisions of Federal law...

  20. 23 CFR 1.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Purpose. 1.1 Section 1.1 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION GENERAL MANAGEMENT AND ADMINISTRATION GENERAL § 1.1 Purpose. The purpose of the regulations in this part is to implement and carry out the provisions of Federal law...

  1. 10 CFR 1002.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Purpose. 1002.1 Section 1002.1 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) OFFICIAL SEAL AND DISTINGUISHING FLAG General § 1002.1 Purpose. The purpose of this part is to describe the official seal and distinguishing flag of the Department of Energy, and to...

  2. 23 CFR 1.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Purpose. 1.1 Section 1.1 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION GENERAL MANAGEMENT AND ADMINISTRATION GENERAL § 1.1 Purpose. The purpose of the regulations in this part is to implement and carry out the provisions of Federal law...

  3. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. The research also attempts to establish how, why, and where the information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.

  4. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollingsworth, Jeff

    2014-07-31

    The purpose of this project was to develop tools and techniques to improve the ability of computational scientists to investigate and correct problems (bugs) in their programs. Specifically, the University of Maryland component of this project focused on the problems associated with the finite number of bits available in a computer to represent numeric values. In large-scale scientific computation, numbers are frequently added to and multiplied with each other billions of times; thus even small errors due to the representation of numbers can accumulate into big errors. However, using too many bits to represent a number results in additional computation, memory, and energy costs. It is therefore critical to find the right size for numbers. This project focused on several aspects of this general problem. First, we developed a tool to look for cancellations: the catastrophic loss of precision that occurs when two numbers are added whose actual values are close to each other but whose representations in the computer are identical or nearly so. Second, we developed a suite of tools that allow programmers to identify exactly how much precision is required for each operation in their program. These tools let programmers verify that enough precision is available and, more importantly, find cases where extra precision could be eliminated so that the program uses less memory, computer time, or energy. The tools use advanced binary modification techniques to allow the analysis of actual optimized code. The system, called Craft, has been applied to a number of benchmarks and real applications.
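
    The catastrophic cancellation described above can be demonstrated with the classic quadratic-formula example. This is a generic illustration of the phenomenon, not output of the Craft tool; the function names and coefficients are ours:

```python
import math

def naive_roots(a, b, c):
    """Textbook quadratic formula; loses precision when b*b >> 4*a*c,
    because -b + d subtracts two nearly equal numbers."""
    d = math.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def stable_roots(a, b, c):
    """Rearranged to avoid the cancellation: compute the well-conditioned
    root first, then recover the other via the product of roots c/a."""
    d = math.sqrt(b * b - 4 * a * c)
    q = -0.5 * (b + math.copysign(d, b))
    return q / a, c / q

# With b huge relative to a*c, the small root is ~ -1e-8.
# The naive formula gets only a few digits right; the stable one is exact
# to machine precision.
print(naive_roots(1.0, 1e8, 1.0)[0])
print(stable_roots(1.0, 1e8, 1.0)[1])
```

    Tools like the one described above automate exactly this kind of diagnosis, flagging operations where the operands agree in most of their leading bits.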

  5. General purpose force doctrine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weltman, J.J.

    In contemporary American strategic parlance, the general purpose forces have come to mean those forces intended for conflict situations other than nuclear war with the Soviet Union. As with all military forces, the general purpose forces are powerfully determined by prevailing conceptions of the problems they must meet and by institutional biases as to the proper way to deal with those problems. This paper deals with the strategic problems these forces are intended to meet, the various and often conflicting doctrines and organizational structures which have been generated in order to meet those problems, and the factors which will influence general purpose doctrine and structure in the future. This paper does not attempt to prescribe technological solutions to the needs of the general purpose forces. Rather, it attempts to display the doctrinal and institutional context within which new technologies must operate, and which will largely determine whether these technologies are accepted into the force structure or not.

  6. Multi-purpose tool mitten

    NASA Technical Reports Server (NTRS)

    Wilcomb, E. F.

    1969-01-01

    Tool mitten provides a low reaction torque source of power for wrench, screwdriver, or drill activities. The technique employed prevents the attachments from drifting away from the operator. While the tools are specifically designed for space environments, they can also be used on steel scaffolding, in high-building maintenance, or in underwater environments.

  7. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    PubMed Central

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  8. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.
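
    The core idea of the analog Monte Carlo transport that MCNP performs — sampling free path lengths from an exponential distribution and tallying outcomes — can be sketched in a few lines. This is a toy illustration, not MCNP itself; the cross section, slab geometry, and function name are invented for the example:

```python
import math
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=1):
    """Toy analog Monte Carlo: estimate the fraction of photons that cross
    a slab without interacting. mu is the total macroscopic cross section
    (1/cm); the distance to the first interaction is sampled from an
    exponential distribution, s = -ln(xi)/mu."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        s = -math.log(rng.random()) / mu
        if s > thickness:
            passed += 1
    return passed / n

# The estimate should approach the analytic Beer-Lambert result exp(-mu*x).
est = transmitted_fraction(mu=0.5, thickness=2.0)
print(est, math.exp(-0.5 * 2.0))
```

    Production codes layer geometry tracking, scattering physics, variance reduction, and tallies on top of this same sampling loop.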

  9. The Cyber Aggression in Relationships Scale: A New Multidimensional Measure of Technology-Based Intimate Partner Aggression.

    PubMed

    Watkins, Laura E; Maldonado, Rosalita C; DiLillo, David

    2018-07-01

    The purpose of this study was to develop and provide initial validation for a measure of adult cyber intimate partner aggression (IPA): the Cyber Aggression in Relationships Scale (CARS). Drawing on recent conceptual models of cyber IPA, items from previous research exploring general cyber aggression and cyber IPA were modified and new items were generated for inclusion in the CARS. Two samples of adults 18 years or older were recruited online. We used item factor analysis to test the factor structure, model fit, and invariance of the measure structure across women and men. Results confirmed that three-factor models for both perpetration and victimization demonstrated good model fit, and that, in general, the CARS measures partner cyber aggression similarly for women and men. The CARS also demonstrated validity through significant associations with in-person IPA, trait anger, and jealousy. Findings suggest the CARS is a useful tool for assessing cyber IPA in both research and clinical settings.

  10. Human error analysis of commercial aviation accidents: application of the Human Factors Analysis and Classification system (HFACS).

    PubMed

    Wiegmann, D A; Shappell, S A

    2001-11-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.

  11. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785

  12. GPU-BSM: A GPU-Based Tool to Map Bisulfite-Treated Reads

    PubMed Central

    Manconi, Andrea; Orro, Alessandro; Manca, Emanuele; Armano, Giuliano; Milanesi, Luciano

    2014-01-01

    Cytosine DNA methylation is an epigenetic mark implicated in several biological processes. Bisulfite treatment of DNA is acknowledged as the gold standard technique to study methylation. This technique introduces changes in the genomic DNA by converting cytosines to uracils while 5-methylcytosines remain nonreactive. During PCR amplification, 5-methylcytosines are amplified as cytosine, whereas uracils and thymines are amplified as thymine. To detect the methylation levels, bisulfite-treated reads must be aligned against a reference genome. Mapping these reads to a reference genome represents a significant computational challenge, mainly due to the increased search space and the loss of information introduced by the treatment. To deal with this computational challenge we devised GPU-BSM, a tool based on modern Graphics Processing Units, hardware accelerators that are increasingly being used to accelerate general-purpose scientific applications. GPU-BSM maps bisulfite-treated reads from both whole genome bisulfite sequencing and reduced representation bisulfite sequencing, and estimates methylation levels. Due to the massive parallelization obtained by exploiting graphics cards, GPU-BSM aligns bisulfite-treated reads faster than other cutting-edge solutions, while outperforming most of them in terms of uniquely mapped reads. PMID:24842718
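
    The conversion logic the abstract describes can be made concrete with a short sketch. This is the standard in-silico trick used by bisulfite aligners in general, not GPU-BSM's actual pipeline; the function names and toy sequences are ours:

```python
def c_to_t(seq: str) -> str:
    """In-silico bisulfite conversion of the forward strand: every C
    becomes T, so converted (unmethylated) and unconverted (methylated)
    cytosines map to the same reference position."""
    return seq.upper().replace("C", "T")

def call_methylation(read: str, ref: str) -> list:
    """For each reference C covered by the read (assumed aligned at
    offset 0), report True if the read kept a C (methylated, protected
    from conversion) or False if it reads T (unmethylated, converted)."""
    calls = []
    for i, base in enumerate(ref[:len(read)]):
        if base == "C":
            calls.append(read[i] == "C")
    return calls

ref = "ACGTCCGA"
read = "ATGTCTGA"  # Cs at positions 1 and 5 converted; C at 4 protected
assert c_to_t(read) == c_to_t(ref[:len(read)])  # converted spaces agree
print(call_methylation(read, ref))  # [False, True, False]
```

    Real aligners must additionally handle both strands, mismatches, and the enlarged search space the conversion creates, which is where GPU acceleration pays off.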

  13. Fuzzy logic system able to detect interesting areas of a video sequence

    NASA Astrophysics Data System (ADS)

    De Vleeschouwer, Christophe; Marichal, Xavier; Delmot, Thierry; Macq, Benoit M. M.

    1997-06-01

    This paper introduces an automatic tool that analyzes a picture according to the semantic interest an observer attributes to its content. Its aim is to assign a 'level of interest' to the distinct areas of the picture extracted by any segmentation tool. For the purpose of dealing with semantic interpretation of images, a single criterion is clearly insufficient, because the human brain, drawing on its a priori knowledge and its huge memory of concrete real-world scenes, combines different subjective criteria in order to reach its final decision. The developed method permits such a combination through a model that uses assumptions to express general subjective criteria. Fuzzy logic enables the user to encode knowledge in a form that is very close to the way experts think about the decision process. This fuzzy modeling is also well suited to representing multiple collaborating or even conflicting expert opinions. The assumptions are verified through a non-hierarchical strategy that considers them in random order, each partial result contributing to the final one. The presented results show that the tool is effective for a wide range of natural pictures. It is versatile and flexible in that it can be used stand-alone or can take into account any a priori knowledge about the scene.
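
    The fuzzy combination of subjective criteria can be sketched with the common Mamdani-style operators (min for AND, max for OR). The rule base below is purely illustrative; the paper does not publish its actual criteria, so the rules, criterion names, and values here are invented:

```python
def fuzzy_and(*degrees):
    """Fuzzy conjunction: the weakest membership degree dominates."""
    return min(degrees)

def fuzzy_or(*degrees):
    """Fuzzy disjunction: the strongest membership degree dominates."""
    return max(degrees)

def level_of_interest(centrality, skin_likeness, contrast):
    """Toy rule base for one segmented region (all inputs in [0, 1]):
       R1: central AND high-contrast -> interesting
       R2: skin-colored              -> interesting (faces draw attention)
    The region's level of interest is the strongest firing rule."""
    r1 = fuzzy_and(centrality, contrast)
    r2 = skin_likeness
    return fuzzy_or(r1, r2)

# A central, high-contrast but non-skin region:
print(level_of_interest(centrality=0.8, skin_likeness=0.1, contrast=0.7))
```

    Because each rule contributes a partial degree rather than a hard yes/no, rules can be evaluated in any order and conflicting expert opinions simply compete through the max operator, which matches the non-hierarchical strategy described above.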

  14. Students' Problem Solving as Mediated by Their Cognitive Tool Use: A Study of Tool Use Patterns

    ERIC Educational Resources Information Center

    Liu, M.; Horton, L. R.; Corliss, S. B.; Svinicki, M. D.; Bogard, T.; Kim, J.; Chang, M.

    2009-01-01

    The purpose of this study was to use multiple data sources, both objective and subjective, to capture students' thinking processes as they were engaged in problem solving, examine the cognitive tool use patterns, and understand what tools were used and why they were used. The findings of this study confirmed previous research and provided clear…

  15. The Effects of Tools of the Mind on Math and Reading Scores in Kindergarten

    ERIC Educational Resources Information Center

    Mackay, Patricia E.

    2013-01-01

    Although a limited body of research has supported the positive impact of the Tools of the Mind curriculum on the development of self-regulation, research supporting a direct relationship between Tools and academic achievement is extremely limited. The purpose of this study is to evaluate the effectiveness of the Tools of the Mind curriculum…

  16. "Development Radar": The Co-Configuration of a Tool in a Learning Network

    ERIC Educational Resources Information Center

    Toiviainen, Hanna; Kerosuo, Hannele; Syrjala, Tuula

    2009-01-01

    Purpose: The paper aims to argue that new tools are needed for operating, developing and learning in work-life networks where academic and practice knowledge are intertwined in multiple levels of and in boundary-crossing across activities. At best, tools for learning are designed in a process of co-configuration, as the analysis of one tool,…

  17. Assumption-aware tools and agency; an interrogation of the primary artifacts of the program evaluation and design profession in working with complex evaluands and complex contexts.

    PubMed

    Morrow, Nathan; Nkwake, Apollo M

    2016-12-01

    Like artisans in a professional guild, we evaluators create tools to suit our ever evolving practice. The tools we use as evaluators are the primary artifacts of our profession, reflect our practice and embody an amalgamation of paradigms and assumptions. With the increasing shifts in evaluation purposes from judging program worth to understanding how programs work, the evaluator's role is changing to that of facilitating stakeholders in a learning process. This involves clarifying purposes and choices, as well as unearthing critical assumptions. In such a role, evaluators become major tool-users and begin to innovate with small refinements or produce completely new tools to fit a specific challenge or context. We interrogate the form and function of 12 tools used by evaluators when working with complex evaluands and complex contexts. The form is described in terms of traditional qualitative techniques and particular characteristics of the elements, use and presentation of each tool. Then the function of each tool is analyzed with respect to articulating assumptions and affecting the agency of evaluators and stakeholders in complex contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. What we know about the purpose, theoretical foundation, scope and dimensionality of existing self-management measurement tools: A scoping review.

    PubMed

    Packer, Tanya L; Fracini, America; Audulv, Åsa; Alizadeh, Neda; van Gaal, Betsie G I; Warner, Grace; Kephart, George

    2018-04-01

    To identify self-report, self-management measures for adults with chronic conditions, and describe their purpose, theoretical foundation, dimensionality (multi versus uni), and scope (generic versus condition specific). A search of four databases (8479 articles) resulted in a scoping review of 28 self-management measures. Although authors identified tools as measures of self-management, wide variation in constructs measured, purpose, and theoretical foundations existed. Subscales on 13 multidimensional tools collectively measure domains of self-management relevant to clients, however no one tool's subscales cover all domains. Viewing self-management as a complex, multidimensional whole, demonstrated that existing measures assess different, related aspects of self-management. Activities and social roles, though important to patients, are rarely measured. Measures with capacity to quantify and distinguish aspects of self-management may promote tailored patient care. In selecting tools for research or assessment, the reason for development, definitions, and theories underpinning the measure should be scrutinized. Our ability to measure self-management must be rigorously mapped to provide comprehensive and system-wide care for clients with chronic conditions. Viewing self-management as a complex whole will help practitioners to understand the patient perspective and their contribution in supporting each individual patient. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Online plot services for paleomagnetism and rock magnetism

    NASA Astrophysics Data System (ADS)

    Hatakeyama, T.

    2017-12-01

    In paleomagnetism and rock magnetism, many specialized plot types are used to present measured data. Researchers in paleomagnetism often use not only general-purpose plotting programs such as Microsoft Excel but also single-purpose tools. A large benefit of the latter is that they can produce high-quality figures for this kind of data. However, those programs require a specific environment to run: a particular type of hardware and platform, a particular operating system and version, runtime libraries, and so on. It is therefore difficult to share results and graphics among collaborators who use different environments on their PCs. A good solution is a program that operates in an environment everyone already has, and the most widely available such environment is the web: almost all current operating systems ship with a web browser, and everyone uses one regularly. We now provide a web-based service for plotting paleomagnetic results easily. We developed original programs with a command-line (non-GUI) interface and prepared web pages for entering the measured data and options, together with a wrapper script that transfers the entered values to the program. The analyzed values and plotted graphs produced by the program are shown in an HTML page and can be downloaded. Our plot services are provided at http://mage-p.org/mageplot/. In this talk, we introduce our program and service and discuss the philosophy and efficiency of these services.
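
    The wrapper idea described above — translating form fields from a web page into the argument list of a non-GUI plotting program — can be sketched as follows. The program name "mageplot" and its flags are illustrative assumptions, not the service's real interface:

```python
import shlex
from urllib.parse import parse_qs

def build_command(query_string: str) -> list:
    """Turn a submitted query string into a safe argv list.
    Passing a list (not a shell string) to subprocess avoids injection."""
    form = parse_qs(query_string)
    argv = ["mageplot", "--format", "svg"]  # hypothetical binary and flags
    if "type" in form:
        argv += ["--plot-type", form["type"][0]]
    if "data" in form:
        argv += ["--data", form["data"][0]]
    return argv

argv = build_command("type=zijderveld&data=site1.csv")
print(shlex.join(argv))
# The wrapper would then run subprocess.run(argv, capture_output=True)
# and embed the generated graph in the HTML page returned to the user.
```

    Keeping the plotting program itself command-line only is what makes this split work: the same binary serves the web front end and local batch use without modification.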

  20. On line dissemination of environmental knowledge for educational purpose

    NASA Astrophysics Data System (ADS)

    Fant, S.; Macaluso, L.; Marani, A.; Scalvini, G.; Zane, O.

    2003-04-01

    The environment is a natural laboratory for learning, always open and available everywhere. The environmental sciences gather knowledge about the environment and make it scientifically transferable through algorithms. The environment can therefore be used, through direct experiments, as a training ground for teaching knowledge from many disciplines. For this reason, a section for disseminating environmental knowledge has been organized within the environmental database for the Lagoon of Venice developed at the Istituto Veneto di Scienze, Lettere ed Arti of Venice (Italy) (URL: www.istitutoveneto.it/venezia/divulgazione/divulgazione.htm). This section provides information and tools that make the database contents easy to understand and promote their use for didactic purposes. The aim is to stimulate users' curiosity and to satisfy the demand for dissemination tools from those who work in the training field. The section is divided into four chapters: Descriptions, with generalities about the types and dynamics of environments found in the Venice lagoon; Cards, with specific information about objects, phenomena, and categories; and Didactics, offering Training Experiences, Educational Courses, and Games and Simulations. A Glossary, with technical terms and idiomatic forms, completes the section. Secondary-school teachers have been involved in order to understand their requirements and experience level when deciding how to organize the contents. Moreover, agreement has been reached with teachers on a report model that allows some standardization in cataloguing the environmental didactics and education activities carried out within the schools.

  1. I-Maculaweb: A Tool to Support Data Reuse in Ophthalmology

    PubMed Central

    Bonetto, Monica; Nicolò, Massimo; Gazzarata, Roberta; Fraccaro, Paolo; Rosa, Raffaella; Musetti, Donatella; Musolino, Maria; Traverso, Carlo E.

    2016-01-01

    This paper presents a Web-based application to collect and manage clinical data and clinical trials together in a single tool. I-maculaweb is a user-friendly Web application designed to manage, share, and analyze clinical data from patients affected by degenerative and vascular diseases of the macula. The unique and innovative scientific and technological element of this project is the integration of individual and population data relevant for degenerative and vascular diseases of the macula. Clinical records can also be extracted for statistical purposes and used by clinical decision support systems. I-maculaweb is based on an existing multilevel and multiscale data management model, which includes general principles suitable for several different clinical domains. The database structure has been specifically built to respect laterality, a key aspect in ophthalmology. Users can add and manage patient records, follow-up visits, treatments, diagnoses, and clinical history. Records can be extracted in two different modes: one for the patient's own center, in which personal details are shown, and one for statistical purposes, in which anonymized data from all centers are visible. The Web platform allows effective management, sharing, and reuse of information within primary care and clinical research. Clear and precise clinical data will improve understanding of the real-life management of degenerative and vascular diseases of the macula and yield more precise epidemiologic and statistical data. Furthermore, this Web-based application can easily be employed as an electronic clinical research file in clinical studies. PMID:27170913

  2. TAXONOMY OF MEDICAL DEVICES IN THE LOGIC OF HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Henschke, Cornelia; Panteli, Dimitra; Perleth, Matthias; Busse, Reinhard

    2015-01-01

    The suitability of general HTA methodology for medical devices is gaining interest as a topic of scientific discourse. Given the broad range of medical devices, there might be differences between groups of devices that impact both the necessity and the methods of their assessment. Our aim is to develop a taxonomy that provides researchers and policy makers with an orientation tool on how to approach the assessment of different types of medical devices. Several classifications for medical devices based on varying rationales for different regulatory and reporting purposes were analyzed in detail to develop a comprehensive taxonomic model. The taxonomy is based on relevant aspects of existing classification schemes incorporating elements of risk and functionality. Its 9 × 6 matrix distinguishes between the diagnostic or therapeutic nature of devices and considers whether the medical device is directly used by patients, constitutes part of a specific procedure, or can be used for a variety of procedures. We considered the relevance of different device categories in regard to HTA to be considerably variable, ranging from high to low. Existing medical device classifications cannot be used for HTA as they are based on different underlying logics. The developed taxonomy combines different device classification schemes used for different purposes. It aims at providing decision makers with a tool enabling them to consider device characteristics in detail across more than one dimension. The placement of device groups in the matrix can provide decision support on the necessity of conducting a full HTA.

  3. 17 CFR 147.1 - General policy considerations, purpose and scope of rules relating to open Commission meetings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false General policy considerations, purpose and scope of rules relating to open Commission meetings. 147.1 Section 147.1 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION OPEN COMMISSION MEETINGS § 147.1 General policy considerations, purpose and scope of rules...

  4. Commercially available gaming systems as clinical assessment tools to improve value in the orthopaedic setting: a systematic review.

    PubMed

    Ruff, Jessica; Wang, Tiffany L; Quatman-Yates, Catherine C; Phieffer, Laura S; Quatman, Carmen E

    2015-02-01

    Commercially available gaming systems (CAGS) such as the Wii Balance Board (WBB) and Microsoft Xbox with Kinect (Xbox Kinect) are increasingly used as balance training and rehabilitation tools. The purpose of this review was to answer the question, "Are commercially available gaming systems valid and reliable instruments for use as clinical diagnostic and functional assessment tools in orthopaedic settings?" and to provide a summary of relevant studies, identify their strengths and weaknesses, and generate conclusions regarding the general validity and reliability of the WBB and Xbox Kinect in orthopaedics. A systematic search was performed using MEDLINE (1996-2013) and Scopus (1996-2013). Inclusion criteria were a minimum of 5 subjects, a full manuscript provided in English or translated, and studies incorporating investigation of CAGS measurement properties. Exclusion criteria included reviews, systematic reviews, summary/clinical commentaries, or case studies; conference proceedings/presentations; cadaveric studies; studies of non-reversible, non-orthopaedic-related musculoskeletal disease; non-human trials; and therapeutic studies not reporting comparative evaluation to already established functional assessment criteria. All studies meeting the inclusion and exclusion criteria were appraised for quality by two independent reviewers, and evidence levels (I-V) were assigned to each study based on established methodological criteria. Three Level II, seven Level III, and one Level IV studies met the inclusion criteria and provided information related to the use of the WBB and Xbox Kinect as clinical assessment tools in the field of orthopaedics. Studies have used the WBB in a variety of clinical applications, including the measurement of center of pressure (COP), measurement of medial-to-lateral (M/L) or anterior-to-posterior (A/P) symmetry, assessment of anatomic landmark positioning, and assessment of fall risk. However, no uniform protocols or outcomes were used to evaluate the quality of the WBB as a clinical assessment tool; therefore a wide range of sensitivities, specificities, accuracies, and validities were reported. It is currently not possible to make a universal generalization about the clinical utility of CAGS in the field of orthopaedics. However, there is evidence to support using the WBB and the Xbox Kinect as tools to obtain reliable and valid COP measurements. The Wii Fit Game may specifically provide reliable and valid measurements for predicting fall risk. Copyright © 2014 Elsevier Ltd. All rights reserved.
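
    The center-of-pressure measurement mentioned above reduces to a load-weighted average of the board's four corner load cells. The sketch below illustrates that computation; the half-dimensions (the WBB sensing surface is roughly 43.3 x 23.8 cm) and the sensor ordering are assumptions for the example, not a vendor specification:

```python
def cop_from_sensors(tl, tr, bl, br, half_w=21.65, half_h=11.9):
    """Return the (x, y) center of pressure in cm, origin at the board
    center, from four corner load-cell readings (top-left, top-right,
    bottom-left, bottom-right). COP is the load-weighted average of the
    sensor positions: right-minus-left for x, top-minus-bottom for y."""
    total = tl + tr + bl + br
    if total <= 0:
        raise ValueError("no load on the board")
    x = half_w * ((tr + br) - (tl + bl)) / total
    y = half_h * ((tl + tr) - (bl + br)) / total
    return x, y

# Equal load on all four cells puts the COP at the board center:
print(cop_from_sensors(10, 10, 10, 10))  # (0.0, 0.0)
```

    The spread of sensitivities reported across studies partly reflects that each group layered its own filtering and sampling protocol on top of this simple formula.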

  5. Appendix W. Cost Analysis in Teacher Education Programs.

    ERIC Educational Resources Information Center

    Sell, G. Roger; And Others

    This paper is an introduction to the basic cost-related tools available to management for planning, evaluating, and organizing resources for the purpose of achieving objectives within a teacher education preparation program. Three tools are presented in separate sections. Part I on the cost accounting tool for identifying, categorizing, and…

  6. Signal Detection Theory as a Tool for Successful Student Selection

    ERIC Educational Resources Information Center

    van Ooijen-van der Linden, Linda; van der Smagt, Maarten J.; Woertman, Liesbeth; te Pas, Susan F.

    2017-01-01

    Prediction accuracy of academic achievement for admission purposes requires adequate "sensitivity" and "specificity" of admission tools, yet the available information on the validity and predictive power of admission tools is largely based on studies using correlational and regression statistics. The goal of this study was to…

  7. U.S. CASE STUDIES USING MUNICIPAL SOLID WASTE DECISION SUPPORT TOOL

    EPA Science Inventory

    The paper provides an overview of some case studies using the recently completed muniicpal solid waste decision support tool (MSW-DST) in communities across the U.S. The purpose of the overview is to help illustrate the variety of potential applications of the tool. The methodolo...

  8. An Analysis of Teacher Selection Tools in Pennsylvania

    ERIC Educational Resources Information Center

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  9. Teaching Bioinformatics and Neuroinformatics by Using Free Web-Based Tools

    ERIC Educational Resources Information Center

    Grisham, William; Schottler, Natalie A.; Valli-Marill, Joanne; Beck, Lisa; Beatty, Jackson

    2010-01-01

    This completely computer-based module's purpose is to introduce students to bioinformatics resources. We present an easy-to-adopt module that weaves together several important bioinformatic tools so students can grasp how these tools are used in answering research questions. Students integrate information gathered from websites dealing with…

  10. Development of the EMAP tool facilitating existential communication between general practitioners and cancer patients.

    PubMed

    Assing Hvidt, Elisabeth; Hansen, Dorte Gilså; Ammentorp, Jette; Bjerrum, Lars; Cold, Søren; Gulbrandsen, Pål; Olesen, Frede; Pedersen, Susanne S; Søndergaard, Jens; Timmermann, Connie; Timm, Helle; Hvidt, Niels Christian

    2017-12-01

    General practice recognizes the existential dimension as an integral part of multidimensional patient care alongside the physical, psychological and social dimensions. However, general practitioners (GPs) report substantial barriers related to communication with patients about existential concerns. To describe the development of the EMAP tool facilitating communication about existential problems and resources between GPs and patients with cancer. A mixed-methods design was chosen comprising a literature search, focus group interviews with GPs and patients (n = 55) and a two-round Delphi procedure initiated by an expert meeting with 14 experts from Denmark and Norway. The development procedure resulted in a semi-structured tool containing suggestions for 10 main questions and 13 sub-questions grouped into four themes covering the existential dimension. The tool utilized the acronym and mnemonic EMAP (existential communication in general practice) indicating the intention of the tool: to provide a map of possible existential problems and resources that the GP and the patient can discuss to find points of reorientation in the patient's situation. This study resulted in a question tool that can serve as inspiration and help GPs when communicating with cancer patients about existential problems and resources. This tool may qualify GPs' assessment of existential distress, increase the patient's existential well-being and help deepen the GP-patient relationship.

  11. 40 CFR 72.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... REGULATION Acid Rain Program General Provisions § 72.1 Purpose and scope. (a) Purpose. The purpose of this... affected sources and affected units under the Acid Rain Program, pursuant to title IV of the Clean Air Act... regulations under this part set forth certain generally applicable provisions under the Acid Rain Program. The...

  12. 40 CFR 72.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... REGULATION Acid Rain Program General Provisions § 72.1 Purpose and scope. (a) Purpose. The purpose of this... affected sources and affected units under the Acid Rain Program, pursuant to title IV of the Clean Air Act... regulations under this part set forth certain generally applicable provisions under the Acid Rain Program. The...

  13. 40 CFR 72.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... REGULATION Acid Rain Program General Provisions § 72.1 Purpose and scope. (a) Purpose. The purpose of this... affected sources and affected units under the Acid Rain Program, pursuant to title IV of the Clean Air Act... regulations under this part set forth certain generally applicable provisions under the Acid Rain Program. The...

  14. 40 CFR 72.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REGULATION Acid Rain Program General Provisions § 72.1 Purpose and scope. (a) Purpose. The purpose of this... affected sources and affected units under the Acid Rain Program, pursuant to title IV of the Clean Air Act... regulations under this part set forth certain generally applicable provisions under the Acid Rain Program. The...

  15. 40 CFR 72.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... REGULATION Acid Rain Program General Provisions § 72.1 Purpose and scope. (a) Purpose. The purpose of this... affected sources and affected units under the Acid Rain Program, pursuant to title IV of the Clean Air Act... regulations under this part set forth certain generally applicable provisions under the Acid Rain Program. The...

  16. 34 CFR 303.1 - Purpose of the early intervention program for infants and toddlers with disabilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Purpose of the early intervention program for infants... EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND TODDLERS WITH DISABILITIES General Purpose, Eligibility, and Other General Provisions § 303.1 Purpose of the early intervention program for infants and...

  17. 34 CFR 303.1 - Purpose of the early intervention program for infants and toddlers with disabilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 2 2011-07-01 2010-07-01 true Purpose of the early intervention program for infants... EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND TODDLERS WITH DISABILITIES General Purpose, Eligibility, and Other General Provisions § 303.1 Purpose of the early intervention program for infants and...

  18. Particle decay of proton-unbound levels in 12N

    DOE PAGES

    Chipps, K. A.; Pain, S. D.; Greife, U.; ...

    2017-04-24

    Transfer reactions are a useful tool for studying nuclear structure, particularly in the regime of low level densities and strong single-particle strengths. Additionally, transfer reactions can populate levels above particle decay thresholds, allowing for the possibility of studying the subsequent decays and furthering our understanding of the nuclei being probed. In particular, the decay of loosely bound nuclei such as 12N can help inform and improve structure models. The purpose of this paper is to learn about the decay of excited states in 12N and, more generally, to inform nuclear structure models, particularly in the case of particle-unbound levels in low-mass systems which are within the reach of state-of-the-art ab initio calculations.

  19. The Ames-Lockheed orbiter processing scheduling system

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Gargan, Robert

    1991-01-01

    A general purpose scheduling system and its application to Space Shuttle Orbiter processing at the Kennedy Space Center (KSC) are described. Orbiter processing entails all the inspection, testing, repair, and maintenance necessary to prepare the Shuttle for launch and takes place within the Orbiter Processing Facility (OPF) at KSC, the Vehicle Assembly Building (VAB), and on the launch pad. The problem is extremely combinatorial in that thousands of tasks, resources, and other temporal considerations must be coordinated. Researchers are building a scheduling tool that they hope will be an integral part of automating the planning and scheduling process at KSC. The scheduling engine is domain independent and is also being applied to Space Shuttle cargo processing and wind tunnel scheduling problems.

  20. IntegromeDB: an integrated system and biological search engine.

    PubMed

    Baitaluk, Michael; Kozhenkov, Sergey; Dubinina, Yulia; Ponomarenko, Julia

    2012-01-19

    With the growth of biological data in volume and heterogeneity, web search engines have become key tools for researchers. However, general-purpose search engines are not specialized for the search of biological data. Here, we present an approach to developing a biological web search engine based on Semantic Web technologies and demonstrate its implementation for retrieving gene- and protein-centered knowledge. The engine is available at http://www.integromedb.org. The IntegromeDB search engine allows scanning data on gene regulation, gene expression, protein-protein interactions, pathways, metagenomics, mutations, diseases, and other gene- and protein-related data that are automatically retrieved from publicly available databases and web pages using biological ontologies. To perfect the resource design and usability, we welcome and encourage community feedback.

  1. Integrated 3-D vision system for autonomous vehicles

    NASA Astrophysics Data System (ADS)

    Hou, Kun M.; Shawky, Mohamed; Tu, Xiaowei

    1992-03-01

    Nowadays, autonomous vehicle research has become a multidisciplinary field, and its evolution is taking advantage of recent technological progress in computer architectures. As development tools become more sophisticated, the trend is toward more specialized, or even dedicated, architectures. In this paper, we focus on a parallel vision subsystem integrated into the overall system architecture. The system modules work in parallel, communicating through a hierarchical blackboard, an extension of the 'tuple space' from LINDA concepts, where they may exchange data or synchronization messages. The general-purpose processing elements serve different roles: high-level processing is built around 40 MHz Intel i860 RISC processors, while low-level processing uses pipelined systolic array processors based on PLAs or FPGAs.

  2. Psychopathic characters in fiction.

    PubMed

    Piechowski-Jozwiak, Bartlomiej; Bogousslavsky, Julien

    2013-01-01

    The theme of psychopathy has fascinated artists and the general public for centuries. The first concepts on psychopathy came from the parasciences, such as phrenology where anatomical features were linked to certain psychopathic/immoral behaviors. The concept of psychopathy was recognized by forensic psychiatry a few decades ago and this official recognition was followed by the emergence of scientific and clinical guidelines for the diagnosis and prognosis of psychopaths. These modern tools can also be used for historical purposes by allowing us to look back on fictional works and identify psychopaths in literature. Interpretation of fictitious psychopaths needs to be related to the historical situation in which the novels were written; such investigations can be both enriching and thrilling. Copyright © 2013 S. Karger AG, Basel.

  3. Residential Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Abdelaziz, Omar A; Jackson, Rogerick K

    The Residential Simulation Tool was developed to understand the impact of residential load consumption on utilities, including the role of demand response. This is complicated because many different residential loads exist and are utilized for different purposes. The tool models human behavior and its contribution to load utilization, which in turn drives the tool's prediction of electrical consumption. The tool integrates a number of databases from the Department of Energy and other government websites to support this load consumption prediction.

  4. Quasipolynomial generalization of Lotka-Volterra mappings

    NASA Astrophysics Data System (ADS)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in fields as diverse as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable, since the quasipolynomial methodology had never before been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a wholesale transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings, as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
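    For readers unfamiliar with discrete-time Lotka-Volterra dynamics, a minimal sketch of one common exponential form of the mapping follows. The specific form, function name and parameter values are illustrative assumptions; the quasipolynomial formalism discussed above is considerably more general.

```python
import math

def lv_map_step(x, lam, A):
    """One iteration of a discrete-time Lotka-Volterra mapping in the
    common exponential form x_i' = x_i * exp(lam_i + sum_j A[i][j]*x_j).
    This generic form is a sketch, not the paper's quasipolynomial
    generalization. Positivity of x is preserved by construction."""
    return [xi * math.exp(li + sum(a * xj for a, xj in zip(row, x)))
            for xi, li, row in zip(x, lam, A)]

# two-species iteration with arbitrary (hypothetical) parameters
x = [0.5, 0.5]
lam = [0.1, -0.1]
A = [[-0.2, -0.1],
     [0.1, -0.2]]
for _ in range(100):
    x = lv_map_step(x, lam, A)
```

    With these parameters the second species dies out and the first settles near its single-species equilibrium, illustrating how the map's long-run behavior follows from the interaction matrix.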

  5. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general-purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform-agnostic environment. The source code compiles on numerous platforms, and is regularly exercised on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code under the NASA open source license.

  6. Analysis and design of friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Jagadeesha, C. B.

    2016-12-01

    Since its inception, the FSW tool has not been systematically analyzed and designed; initial dimensions of FSW tools are decided by educated guess. Optimum stresses on the tool pin were determined at optimized parameters for bead-on-plate welding on an AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; its life was determined to be 2.66×10⁵ cycles (revolutions). One can therefore conclude that an arbitrarily dimensioned FSW tool generally has finite life and cannot be assumed to last indefinitely. More generally, this analysis allows the suitability of a tool and its material for FSW of given workpiece materials to be determined in advance, in terms of the fatigue life of the tool.
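    As a hedged illustration of how a finite fatigue life such as the reported 2.66×10⁵ cycles can be estimated, the sketch below applies Basquin's standard S-N relation; the stress amplitude and material constants are placeholders, not the values from the paper's analysis.

```python
def basquin_life(stress_amplitude, sigma_f=900.0, b=-0.12):
    """Fatigue life N (cycles) from Basquin's S-N relation
    sa = sigma_f * (2N)**b, i.e. N = 0.5 * (sa / sigma_f)**(1 / b).
    sigma_f (fatigue strength coefficient, MPa) and b (fatigue strength
    exponent) are generic placeholder values, not the tool-material
    properties used in the paper."""
    return 0.5 * (stress_amplitude / sigma_f) ** (1.0 / b)

# a hypothetical tool-pin stress amplitude of 200 MPa gives a finite life
life = basquin_life(200.0)
print(f"estimated life: {life:,.0f} cycles")
```

    Because the exponent 1/b is large and negative, modest reductions in pin stress amplitude extend the predicted life by orders of magnitude, which is why optimizing welding parameters matters for tool durability.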

  7. Development and psychometric evaluation of the Impact of Health Information Technology (I-HIT) scale.

    PubMed

    Dykes, Patricia C; Hurley, Ann; Cashen, Margaret; Bakken, Suzanne; Duffy, Mary E

    2007-01-01

    The use of health information technology (HIT) for the support of communication processes and data and information access in acute care settings is a relatively new phenomenon. A means of evaluating the impact of HIT in hospital settings is needed. The purpose of this research was to design and psychometrically evaluate the Impact of Health Information Technology scale (I-HIT). I-HIT was designed to measure the perception of nurses regarding the ways in which HIT influences interdisciplinary communication and workflow patterns and nurses' satisfaction with HIT applications and tools. Content for a 43-item tool was derived from the literature, and supported theoretically by the Coiera model and by nurse informaticists. Internal consistency reliability analysis using Cronbach's alpha was conducted on the 43-item scale to initiate the item reduction process. Items with an item total correlation of less than 0.35 were removed, leaving a total of 29 items. Item analysis, exploratory principal component analysis and internal consistency reliability using Cronbach's alpha were used to confirm the 29-item scale. Principal components analysis with Varimax rotation produced a four-factor solution that explained 58.5% of total variance (general advantages, information tools to support information needs, information tools to support communication needs, and workflow implications). Internal consistency of the total scale was 0.95 and ranged from 0.80-0.89 for four subscales. I-HIT demonstrated psychometric adequacy and is recommended to measure the impact of HIT on nursing practice in acute care settings.
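    The internal-consistency statistic used throughout the abstract, Cronbach's alpha, can be sketched in a few lines. The item data below are invented for illustration and are not the I-HIT responses.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a scale given as item-score columns
    (items[i][j] = score of respondent j on item i):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(statistics.variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# toy 4-item, 6-respondent data (invented, not the I-HIT responses)
items = [
    [3, 4, 2, 5, 4, 3],
    [3, 5, 2, 4, 4, 2],
    [2, 4, 3, 5, 5, 3],
    [3, 4, 2, 5, 3, 3],
]
alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.2f}")
```

    Values of 0.80-0.89 for the four I-HIT subscales and 0.95 for the total scale, as reported above, indicate high internal consistency by this measure.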

  8. 12 CFR 223.16 - What transactions by a member bank with any person are treated as transactions with an affiliate?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... securities from or through an affiliate of the member bank. (4) General purpose credit card transactions. (i... imposed in, a general purpose credit card issued by the member bank to the nonaffiliate. (ii) Definition. “General purpose credit card” means a credit card issued by a member bank that is widely accepted by...

  9. 12 CFR 223.16 - What transactions by a member bank with any person are treated as transactions with an affiliate?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... securities from or through an affiliate of the member bank. (4) General purpose credit card transactions. (i... imposed in, a general purpose credit card issued by the member bank to the nonaffiliate. (ii) Definition. “General purpose credit card” means a credit card issued by a member bank that is widely accepted by...

  10. 12 CFR 223.16 - What transactions by a member bank with any person are treated as transactions with an affiliate?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... securities from or through an affiliate of the member bank. (4) General purpose credit card transactions. (i... imposed in, a general purpose credit card issued by the member bank to the nonaffiliate. (ii) Definition. “General purpose credit card” means a credit card issued by a member bank that is widely accepted by...

  11. 12 CFR 223.16 - What transactions by a member bank with any person are treated as transactions with an affiliate?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... securities from or through an affiliate of the member bank. (4) General purpose credit card transactions. (i... imposed in, a general purpose credit card issued by the member bank to the nonaffiliate. (ii) Definition. “General purpose credit card” means a credit card issued by a member bank that is widely accepted by...

  12. 12 CFR 223.16 - What transactions by a member bank with any person are treated as transactions with an affiliate?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... securities from or through an affiliate of the member bank. (4) General purpose credit card transactions. (i... imposed in, a general purpose credit card issued by the member bank to the nonaffiliate. (ii) Definition. “General purpose credit card” means a credit card issued by a member bank that is widely accepted by...

  13. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory of how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.

  14. [Food photography atlas: its suitability for quantifying food and nutrient consumption in nutritional epidemiological research in Córdoba, Argentina].

    PubMed

    Navarro, A; Cristaldo, P E; Díaz, M P; Eynard, A R

    2000-01-01

    Food photographs are suitable visual tools for quantifying food and nutrient consumption while avoiding bias from self-assessment. The aims were to determine the perception of food portion sizes and to establish the efficacy of food photographs for dietary assessment. A food frequency questionnaire (FFQ) including 118 food items of daily consumption was applied to 30 adults representative of the population of Córdoba, Argentina. Among several food models (papier-mâché, plastics) and photographs, those which most accurately served the purpose were selected, and three standard portion sizes (small, medium and large) were determined. Data were evaluated with descriptive statistics and a chi-square goodness-of-fit test. Portion sizes of 51 percent of the foods were assessed in concordance with the reference size; in general, the remainder were overestimated. Ninety percent of volunteers concluded that the photographs were the best visual resource. The photographic atlas of food is a useful material for quantifying dietary consumption, suitable for many types of dietary assessment. In conclusion, comparison among photographs of three previously standardized portions for each food is highly recommendable.

  15. StrBioLib: a Java library for development of custom computational structural biology applications.

    PubMed

    Chandonia, John-Marc

    2007-08-01

    StrBioLib is a library of Java classes useful for developing software for computational structural biology research. StrBioLib contains classes to represent and manipulate protein structures, biopolymer sequences, sets of biopolymer sequences, and alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with commonly used bioinformatics applications, including (psi)-blast, modeller, muscle and Primer3, and tools are provided to read and write many file formats used to represent bioinformatic data. The library includes a general-purpose neural network object with multiple training algorithms, the Hooke and Jeeves non-linear optimization algorithm, and tools for efficient C-style string parsing and formatting. StrBioLib is the basis for the Pred2ary secondary structure prediction program, is used to build the astral compendium for sequence and structure analysis, and has been extensively tested through use in many smaller projects. Examples and documentation are available at the site below. StrBioLib may be obtained under the terms of the GNU LGPL license from http://strbio.sourceforge.net/

  16. AOFlagger: RFI Software

    NASA Astrophysics Data System (ADS)

    Offringa, A. R.

    2010-10-01

    The RFI software presented here can automatically flag data and can be used to analyze the data in a measurement. The purpose of flagging is to mark samples that are affected by interfering sources such as radio stations, airplanes, electrical fences or other transmitting interferers. The tools in the package are meant for offline use. The software package contains a graphical interface ("rfigui") that can be used to visualize a measurement set and analyze mitigation techniques. It also contains a console flagger ("rficonsole") that can execute a script of mitigation functions without the overhead of a graphical environment. All tools were written in C++. The software has been tested extensively on low radio frequencies (150 MHz or lower) produced by the WSRT and LOFAR telescopes. LOFAR is the Low Frequency Array that is built in and around the Netherlands. Higher frequencies should work as well. Some of the methods implemented are the SumThreshold, the VarThreshold and the singular value decomposition (SVD) method. Included also are several surface fitting algorithms. The software is published under the GNU General Public License version 3.
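    Of the methods named above, SumThreshold is straightforward to sketch. The following is a simplified one-dimensional version; the threshold parameters are illustrative defaults, not AOFlagger's tuned settings.

```python
import math

def sum_threshold(data, chi1=6.0, rho=1.5, max_window=16):
    """Simplified 1-D SumThreshold pass: flag any run of M consecutive
    samples whose sum exceeds M * chi_M, where chi_M = chi1 / rho**log2(M)
    shrinks as M grows, so weak but extended interference is still caught."""
    flags = [False] * len(data)
    m = 1
    while m <= max_window:
        chi_m = chi1 / rho ** math.log2(m)
        # samples flagged at smaller window sizes are replaced by the
        # current threshold so they neither hide nor exaggerate
        # neighbouring interference
        work = [chi_m if f else v for v, f in zip(data, flags)]
        for i in range(len(data) - m + 1):
            if sum(work[i:i + m]) > chi_m * m:
                for j in range(i, i + m):
                    flags[j] = True
        m *= 2
    return flags

# a strong narrow spike and a weak broad interferer on a zero baseline
data = [0.0] * 30
data[3] = 10.0               # strong: caught at window size 1
for i in range(10, 18):
    data[i] = 3.0            # weak but broad: caught at window size 4
flags = sum_threshold(data)
```

    The decreasing per-sample threshold is the key idea: a sample of amplitude 3 survives the single-sample test but is flagged once eight of its neighbours share that level.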

  17. Multi-omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    PubMed

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells, like multi-scale spatial organization, and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from omics experiments. In this work, we present opportunities and challenges of multi-omics data integration in current systems biology pipelines. We discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale dataset integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  18. Science data visualization in planetary and heliospheric contexts with 3DView

    NASA Astrophysics Data System (ADS)

    Génot, V.; Beigbeder, L.; Popescu, D.; Dufourg, N.; Gangloff, M.; Bouchemit, M.; Caussarieu, S.; Toniutti, J.-P.; Durand, J.; Modolo, R.; André, N.; Cecconi, B.; Jacquey, C.; Pitout, F.; Rouillard, A.; Pinto, R.; Erard, S.; Jourdane, N.; Leclercq, L.; Hess, S.; Khodachenko, M.; Al-Ubaidi, T.; Scherf, M.; Budnik, E.

    2018-01-01

    We present a 3D orbit viewer application capable of displaying science data. 3DView, a web tool designed by the French Plasma Physics Data Center (CDPP) for the planetology and heliophysics community, has extended functionalities to render space physics data (observations and models alike) in their original 3D context. Time series, vectors, dynamic spectra, celestial body maps, magnetic field or flow lines, 2D cuts in simulation cubes, etc., are among the variety of data representations enabled by 3DView. The direct connection to several large databases, the use of VO standards and the possibility to upload user data make 3DView a versatile tool able to cover a wide range of space physics contexts. The code is open source and the software is regularly used at Master's degree level and in summer schools for pedagogical purposes. The present paper describes the general architecture and all major functionalities, and offers several science cases (simulation rendering, mission preparation, etc.) which can be easily replayed by interested readers. Finally, future developments are outlined.

  19. Liquid chromatography coupled with time-of-flight and ion trap mass spectrometry for qualitative analysis of herbal medicines.

    PubMed

    Chen, Xiao-Fei; Wu, Hai-Tang; Tan, Guang-Guo; Zhu, Zhen-Yu; Chai, Yi-Feng

    2011-11-01

    With the expansion of the herbal medicine (HM) market, the issue of how to apply up-to-date analytical tools to the qualitative analysis of HMs, so as to assure their quality, safety and efficacy, has attracted great attention. Due to its inherent capability for accurate mass measurement and multiple-stage analysis, the integrated strategy of liquid chromatography (LC) coupled with time-of-flight mass spectrometry (TOF-MS) and ion trap mass spectrometry (IT-MS) is well suited as a qualitative analysis tool in this field. The purpose of this review is to provide an overview of the potential of this integrated strategy, including a review of the general features of LC-IT-MS and LC-TOF-MS, the advantages of their combination, the common procedures for structure elucidation, the potential of LC-hybrid-IT-TOF/MS, and a summary and discussion of applications of the integrated strategy to HM qualitative analysis (2006-2011). The advantages and future developments of LC coupled with IT and TOF-MS are highlighted.

  20. The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Detwiler, J. A.; Finnerty, P.; Kröninger, K.; Lenz, D.; Liu, J.; Marino, M. G.; Martin, R.; Nguyen, K. D.; Pandola, L.; Schubert, A. G.; Volynets, O.; Zavarise, P.

    2012-07-01

    The Gerda and Majorana experiments will search for neutrinoless double-beta decay of 76Ge using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both Gerda and Majorana.
