Sample records for development tool called

  1. Facility Composer (Trademark) and PACES (Trademark) Integration: Development of an XML Interface Based on Industry Foundation Classes

    DTIC Science & Technology

    2007-11-01

    Engineering Research Laboratory is currently developing a set of facility ‘architectural’ programming tools, called Facility Composer (Trademark) (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility “architectural” programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
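
    To make the registration pattern concrete, the following is a minimal sketch, in Python for brevity, of the loose coupling the abstract describes: application code emits plain function-call-style events, and a performance tool opts in as a listener for only the calls it cares about. All names here are hypothetical illustrations, not the actual PIG interface, which is a C/Fortran linkage mechanism.

      from collections import defaultdict

      class ProfileInterface:
          def __init__(self):
              self._listeners = defaultdict(list)  # event name -> registered callbacks

          def register(self, event_name, callback):
              """A performance tool subscribes to one named event."""
              self._listeners[event_name].append(callback)

          def emit(self, event_name, **payload):
              """Application code calls this like an ordinary function; with no
              listener registered, the call is effectively a no-op."""
              for callback in self._listeners[event_name]:
                  callback(**payload)

      pig = ProfileInterface()
      pig.register("loop_start", lambda name: print(f"timer started for {name}"))
      pig.emit("loop_start", name="pressure_solve")  # instrumented application call
      pig.emit("io_done", nbytes=4096)               # ignored: no listener registered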

  3. VIPER: a web application for rapid expert review of variant calls.

    PubMed

    Wöste, Marius; Dugas, Martin

    2018-06-01

    With the rapid development in next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. The manual inspection often applied to such calls is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and JavaScript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
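
    As a minimal sketch of the triage loop VIPER supports, the Python snippet below filters a set of variant calls down to those needing expert review; the field names and thresholds are hypothetical, not taken from the VIPER code base.

      variants = [
          {"chrom": "chr7", "pos": 140453136, "type": "SNV", "af": 0.42, "depth": 118},
          {"chrom": "chr3", "pos": 178936091, "type": "SNV", "af": 0.03, "depth": 15},
          {"chrom": "chr1", "pos": 115258747, "type": "DEL", "af": 0.21, "depth": 64},
      ]

      def needs_review(v, min_depth=30, min_af=0.05):
          # Low-coverage or low-allele-fraction calls are the likeliest artifacts,
          # so route them to manual inspection.
          return v["depth"] < min_depth or v["af"] < min_af

      for v in filter(needs_review, variants):
          print(f"review {v['type']} at {v['chrom']}:{v['pos']} (AF={v['af']}, DP={v['depth']})")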

  4. SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)

    EPA Science Inventory

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  5. Teaching Web Security Using Portable Virtual Labs

    ERIC Educational Resources Information Center

    Chen, Li-Chiou; Tao, Lixin

    2012-01-01

    We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…

  6. SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)

    EPA Science Inventory

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  7. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  8. Usability Tests in CALL Development: Pilot Studies in the Context of the Dire autrement and Francotoile Projects

    ERIC Educational Resources Information Center

    Hamel, Marie-Josee; Caws, Catherine

    2010-01-01

    This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…

  9. Developing Multimedia Courseware for the Internet's Java versus Shockwave.

    ERIC Educational Resources Information Center

    Majchrzak, Tina L.

    1996-01-01

    Describes and compares two methods for developing multimedia courseware for use on the Internet: an authoring tool called Shockwave, and an object-oriented language called Java. Topics include vector graphics, browsers, interaction with network protocols, data security, multithreading, and computer languages versus development environments. (LRW)

  10. ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES

    EPA Science Inventory

    Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...

  11. How Much Professional Development Is Enough? Meeting the Needs of Independent Music Teachers Learning to Use a Digital Tool

    ERIC Educational Resources Information Center

    Upitis, Rena; Brook, Julia

    2017-01-01

    Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…

  12. CALL in the Zone of Proximal Development: Novelty Effects and Teacher Guidance

    ERIC Educational Resources Information Center

    Karlström, Petter; Lundin, Eva

    2013-01-01

    Digital tools are not always used in the manner their designers had in mind. Therefore, it is not enough to assume that learning through CALL tools occurs in intended ways, if at all. We have studied the use of an enhanced word processor for writing essays in Swedish as a second language. The word processor contained natural language processing…

  13. Decades of CALL Development: A Retrospective of the Work of James Pusack and Sue Otto

    ERIC Educational Resources Information Center

    Otto, Sue E. K.

    2010-01-01

    In this article, the author describes a series of projects that James Pusack and the author engaged in together, a number of them to develop CALL authoring tools. With their shared love of technology and dedication to language teaching and learning, they embarked on a long and immensely enjoyable career in CALL during which each project evolved…

  14. FFI: What it is and what it can do for you

    Treesearch

    Duncan C. Lutes; MaryBeth Keifer; Nathan C. Benson; John F. Caratti

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool (FEAT). FFI provides...

  15. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (API's), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level API's to implement the desired interactions between distributed applications.
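
    The snippet below is a hedged Python sketch of one generic coordination service of the kind the abstract attributes to NetWorks!: a mediator that routes messages between distributed applications and maps fields across their disparate representations. All names are illustrative assumptions; the record does not document the actual NetWorks! API.

      class Mediator:
          def __init__(self):
              self._routes = {}   # (source app, message type) -> (target app, field map)
              self._inboxes = {}  # app name -> delivered messages

          def connect(self, app):
              self._inboxes[app] = []

          def add_route(self, source, message_type, target, field_map):
              self._routes[(source, message_type)] = (target, field_map)

          def send(self, source, message_type, payload):
              # Translate field names into the target's representation, then deliver.
              target, field_map = self._routes[(source, message_type)]
              translated = {field_map.get(k, k): v for k, v in payload.items()}
              self._inboxes[target].append((message_type, translated))

      m = Mediator()
      m.connect("planner")
      m.connect("scheduler")
      # The scheduler calls "task_id" what the planner calls "activity".
      m.add_route("planner", "new_task", "scheduler", {"activity": "task_id"})
      m.send("planner", "new_task", {"activity": "A-17", "duration_h": 3})
      print(m._inboxes["scheduler"])  # [('new_task', {'task_id': 'A-17', 'duration_h': 3})]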

  16. Ketso: A New Tool for Extension Professionals

    ERIC Educational Resources Information Center

    Bates, James S.

    2016-01-01

    Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…

  17. Web 2.0 in Computer-Assisted Language Learning: A Research Synthesis and Implications for Instructional Design and Educational Practice

    ERIC Educational Resources Information Center

    Parmaxi, Antigoni; Zaphiris, Panayiotis

    2017-01-01

    This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…

  18. Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN)

    EPA Science Inventory

    U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...

  19. Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government-Industry Collaboration for the Development of Game-Based Training Tools

    DTIC Science & Technology

    2016-04-01

    IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. R.M. Seater, C.E. Rose...Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. C.E. Rose, A.S. Norige, Group 44, R.M. Seater, K.C...Report 1208, Lexington, Massachusetts. EXECUTIVE SUMMARY: Game-based training tools, sometimes called “serious

  20. Rapid E-Learning Simulation Training and User Response

    ERIC Educational Resources Information Center

    Rackler, Angeline

    2011-01-01

    A new trend in e-learning development is to have subject matter experts use rapid development tools to create training simulations. This type of training is called rapid e-learning simulation training. Though companies are using rapid development tools to create training quickly and cost effectively, there is little empirical research to indicate…

  1. A Management Tool Kit on Training Needs Assessment and Programme Design: An Integrated Resource for Management Development in Transition Countries. Companion.

    ERIC Educational Resources Information Center

    European Training Foundation, Turin (Italy).

    This document presents a management tool kit on training needs assessment and program design for countries in transition to a market economy. Chapter 1 describes the tool's development within the framework of the project called Strengthening of Partnership between Management Training Institutions and Companies, Ukraine-Kazakhstan-Kyrgyzstan.…

  2. Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN) - slides

    EPA Science Inventory

    U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...

  3. Web Development Simplified

    ERIC Educational Resources Information Center

    Becker, Bernd W.

    2010-01-01

    The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…

  4. Developing an Indigenous Proficiency Scale

    ERIC Educational Resources Information Center

    Kahakalau, Ku

    2017-01-01

    With an increased interest in the revitalization of Indigenous languages and cultural practices worldwide, there is also an increased need to develop tools to support Indigenous language learners and instructors. The purpose of this article is to present such a tool, called ANA 'OLELO, designed specifically to assess Hawaiian language proficiency.…

  5. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  6. New Tool for Benefit-Cost Analysis in Evaluating Transportation Alternatives

    DOT National Transportation Integrated Search

    1997-01-01

    The Intermodal Surface Transportation Efficiency Act (ISTEA) emphasizes assessment of multi-modal alternatives and demand management strategies. In 1995, the Federal Highway Administration (FHWA) developed a corridor sketch planning tool called the S...

  7. Overview of Current Activities in Combustion Instability

    DTIC Science & Technology

    2015-10-02

    and avoid liquid rocket engine combustion stability problems. Approach: 1) Develop a SOA combustion stability software package called Stable...phase II will invest in Multifidelity Tools and Methodologies – CSTD will develop a SOA combustion stability software package called Stable Combustion

  8. Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help

    DTIC Science & Technology

    2005-11-01

    creating bottom-up taxonomies—called folksonomies—using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web...tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual

  9. Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government-Industry Collaboration for the Development of Game-Based Training Tools

    DTIC Science & Technology

    2016-05-05

    Training for IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. R.M. Seater...Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based...unlimited. EXECUTIVE SUMMARY: Game-based training tools, sometimes called “serious games,” are becoming

  10. Measuring the Style of Innovative Thinking among Engineering Students

    ERIC Educational Resources Information Center

    Passig, David; Cohen, Lizi

    2014-01-01

    Background: Many tools have been developed to measure the ability of workers to innovate. However, all of them are based on self-reporting questionnaires, which raises questions about their validity. Purpose: The aim was to develop and validate a tool, called Ideas Generation Implementation (IGI), to objectively measure the style and potential of…

  11. Program Evaluation: The Board Game--An Interactive Learning Tool for Evaluators

    ERIC Educational Resources Information Center

    Febey, Karen; Coyne, Molly

    2007-01-01

    The field of program evaluation lacks interactive teaching tools. To address this pedagogical issue, the authors developed a collaborative learning technique called Program Evaluation: The Board Game. The authors present the game and its development in this practitioner-oriented article. The evaluation board game is an adaptable teaching tool…

  12. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

    This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low-power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model, and facilitate porting designs to different platforms.

  13. Engaging landowners effectively: creating a Call Before You Cut campaign in the central hardwoods region

    Treesearch

    Mary L. Tyrrell; David Apsley; Purnima Chawla; Brett Butler

    2013-01-01

    Social marketing tools and approaches were used to develop a Call Before You Cut campaign for six states in the Central Hardwoods Region (Illinois, Indiana, Iowa, Missouri, Ohio, West Virginia). The campaign was developed from research on landowner values, objectives, and behavior, based on National Woodland Owner Survey data and landowner focus groups, and discussions...

  14. Enhancement Approach of Object Constraint Language Generation

    NASA Astrophysics Data System (ADS)

    Salemi, Samin; Selamat, Ali

    2018-01-01

    OCL is the most prevalent language for documenting system constraints annotated in UML. Writing OCL specifications is not an easy task due to the complexity of the OCL syntax; therefore, an approach to help developers write OCL specifications is needed. There are two existing approaches: first, creating OCL specifications with a tool called COPACABANA; second, an MDA-based approach that helps developers write OCL specifications with another tool called NL2OCLviaSBVR, which generates OCL specifications automatically. This study presents another MDA-based approach called En2OCL, and its objective is twofold: (1) to improve the precision of the existing works; (2) to present a benchmark of these approaches. The benchmark shows that the accuracy of COPACABANA, NL2OCLviaSBVR, and En2OCL is 69.23, 84.64, and 88.40, respectively.

  15. TBell: A mathematical tool for analyzing decision tables

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Chen, Zewei

    1994-01-01

    This paper describes the development of mathematical theory and software to analyze specifications that are developed using decision tables. A decision table is a tabular format for specifying a complex set of rules that chooses one of a number of alternative actions. The report also describes a prototype tool, called TBell, that automates certain types of analysis.
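
    To illustrate the structure such a tool analyzes, here is a small Python sketch of a decision table as a list of condition/action rules, plus the kind of completeness and consistency check (exactly one rule fires for every input) that table analysis performs. The rules and the probing scheme are invented for illustration; the record does not specify TBell's actual analyses.

      rules = [
          (lambda t: t < 0,        "heat"),
          (lambda t: 0 <= t <= 30, "hold"),
          (lambda t: t > 30,       "cool"),
      ]

      def decide(t):
          fired = [action for cond, action in rules if cond(t)]
          assert len(fired) == 1, f"table is ambiguous or incomplete at t={t}"
          return fired[0]

      # Probe a sample of the input domain for gaps and overlaps.
      for t in range(-10, 41):
          decide(t)
      print(decide(35))  # cool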

  16. Interactivity Technologies to Improve the Learning in Classrooms through the Cloud

    ERIC Educational Resources Information Center

    Fardoun, Habib M.; Alghazzawi, Daniyal M.; Paules, Antonio

    2018-01-01

    In this paper, the authors present a cloud system that incorporates tools developed in HTML5 and jQuery technologies, which are offered to professors and students in the development of a teaching methodology called flipped classroom, where the theoretical content is usually delivered by video files and self-assessment tools that students can…

  17. Researching Travel Behavior and Adaptability: Using a Virtual Reality Role-Playing Game

    ERIC Educational Resources Information Center

    Watcharasukarn, Montira; Krumdieck, Susan; Green, Richard; Dantas, Andre

    2011-01-01

    This article describes a virtual reality role-playing game that was developed as a survey tool to collect travel behavior data and explore and monitor travel behavior adaptation. The Advanced Energy and Material Systems Laboratory has designed, developed a prototype, and tested such a game platform survey tool, called Travel Activity Constraint…

  18. How freight moves: estimating mileage and routes using an innovative GIS tool

    DOT National Transportation Integrated Search

    2007-06-01

    The Bureau of Transportation Statistics (BTS) has developed an innovative software tool, called GeoMiler, that is helping researchers better estimate freight travel. GeoMiler is being used to compute mileages along likely routes for the nearly 6 mill...

  19. Feasibility of lane closures using probe data : technical brief.

    DOT National Transportation Integrated Search

    2017-04-01

    This study developed an on-line system analysis tool called the Work Zone Interactive Management Application - Planning (WIMAP-P), an easy-to-use and easy-to-learn tool for predicting the traffic impact caused by work zone lane closures on freewa...

  20. On the Edge: Intelligent CALL in the 1990s.

    ERIC Educational Resources Information Center

    Underwood, John

    1989-01-01

    Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…

  1. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."

  2. Active Reading Documents (ARDs): A Tool to Facilitate Meaningful Learning through Reading

    ERIC Educational Resources Information Center

    Dubas, Justin M.; Toledo, Santiago A.

    2015-01-01

    Presented here is a practical tool called the Active Reading Document (ARD) that can give students the necessary incentive to engage with the text/readings. By designing the tool to incrementally develop student understanding of the material through reading using Marzano's Taxonomy as a framework, the ARD offers support through scaffolding as they…

  3. A call for benchmarking transposable element annotation methods.

    PubMed

    Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu

    2015-01-01

    DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks, that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.

  4. GOES-R AWG GLM Val Tool Development

    NASA Technical Reports Server (NTRS)

    Bateman, Monte; Mach, Douglas; Goodman, Steve; Blakeslee, Richard; Koshak, William

    2012-01-01

    We are developing tools needed to enable the validation of the Geostationary Lightning Mapper (GLM). In order to develop and test these tools, we have need of a robust, high-fidelity set of GLM proxy data. Many steps have been taken to ensure that the proxy data are high quality. LIS is the closest analog that exists for GLM, so it has been used extensively in developing the GLM proxy. We have verified the proxy data both statistically and algorithmically. The proxy data are pixel (event) data, called Level 1B. These data were then clustered into flashes by the Lightning Cluster-Filter Algorithm (LCFA), generating proxy Level 2 data. These were then compared with the data used to generate the proxy, and both the proxy data and the LCFA were validated. We have developed tools to allow us to visualize and compare the GLM proxy data with several other sources of lightning and other meteorological data (the so-called shallow-dive tool). The shallow-dive tool shows storm-level data and can ingest many different ground-based lightning detection networks, including: NLDN, LMA, WWLLN, and ENTLN. These are presented in a way such that it can be seen if the GLM is properly detecting the lightning in location and time comparable to the ground-based networks. Currently in development is the deep-dive tool, which will allow us to dive into the GLM data, down to flash, group and event level. This will allow us to assess performance in comparison with other data sources, and tell us if there are detection, timing, or geolocation problems. These tools will be compatible with the GLM Level-2 data format, so they can be used beginning on Day 0.
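
    The following Python fragment is a hedged sketch of the kind of clustering the abstract ascribes to the LCFA: grouping pixel-level events into flashes when they are close in time and space. The greedy single-pass scheme and the thresholds are illustrative assumptions, not the real algorithm.

      events = [  # (time in s, latitude, longitude)
          (0.000, 35.10, -97.40), (0.012, 35.11, -97.41),
          (0.900, 35.10, -97.39), (5.000, 36.50, -95.00),
      ]

      flashes = []        # each flash is a list of events
      DT, DX = 0.3, 0.2   # maximum gaps (seconds, degrees) to join a flash

      for ev in sorted(events):
          for flash in flashes:
              t, lat, lon = flash[-1]
              if ev[0] - t <= DT and abs(ev[1] - lat) <= DX and abs(ev[2] - lon) <= DX:
                  flash.append(ev)
                  break
          else:
              flashes.append([ev])  # no nearby flash: start a new one

      print(f"{len(events)} events clustered into {len(flashes)} flashes")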

  5. Current Capabilities and Planned Enhancements of SUSTAIN - Paper

    EPA Science Inventory

    Efforts have been under way by the U.S. Environmental Protection Agency (EPA) since 2003 to develop a decision-support tool for placement of best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment ...

  6. A GIS-BASED MODAL MODEL OF AUTOMOBILE EXHAUST EMISSIONS

    EPA Science Inventory

    The report presents progress toward the development of a computer tool called MEASURE, the Mobile Emission Assessment System for Urban and Regional Evaluation. The tool works toward a goal of providing researchers and planners with a way to assess new mobile emission mitigation s...

  7. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistical method called the Integrated Kurtosis-based Algorithm for Z-filter technique (I-kaz) was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out using a CNC turning machine Colchester Master Tornado T4 in dry cutting condition. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear lands (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
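
    As a minimal sketch of the regression step, the Python lines below fit a line relating a signal coefficient (standing in for the I-kaz 3D coefficient) to flank wear VB; the numbers and the linear model form are made-up assumptions for illustration, not data from the paper.

      import numpy as np

      vb = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])  # flank wear VB (mm), hypothetical
      coeff = np.array([8.1, 6.9, 5.6, 4.8, 3.9, 3.1])     # I-kaz-like coefficient, hypothetical

      slope, intercept = np.polyfit(vb, coeff, 1)          # least-squares line
      # The negative slope reproduces the reported trend: the coefficient
      # decreases as tool wear increases.
      print(f"coefficient ~= {slope:.2f} * VB + {intercept:.2f}")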

  8. The Development of a Situational Judgement Test of Personal Attributes for Quality Teaching in Rural and Remote Australia

    ERIC Educational Resources Information Center

    Durksen, Tracy L.; Klassen, Robert M.

    2018-01-01

    Education authorities in Australia are calling for valid tools to help assess prospective teachers' non-academic attributes, with a particular need for identifying those attributes necessary for effective teaching in specific contexts. With the New South Wales (NSW) Department of Education, we aimed to develop a scenario-based tool to help assess…

  9. Voice and gesture-based 3D multimedia presentation tool

    NASA Astrophysics Data System (ADS)

    Fukutake, Hiromichi; Akazawa, Yoshiaki; Okada, Yoshihiro

    2007-09-01

    This paper proposes a 3D multimedia presentation tool that allows the user to interact intuitively through voice input and gesture input alone, without using a standard keyboard or a mouse device. The authors developed this system as a presentation tool to be used in a presentation room equipped with a large screen, like an exhibition room in a museum, because in such a presentation environment it is better to use voice commands and gesture pointing input than a keyboard or a mouse device. This system was developed using IntelligentBox, which is a component-based 3D graphics software development system. IntelligentBox already provides various types of 3D visible, reactive functional components called boxes, e.g., a voice input component and various multimedia handling components. IntelligentBox also provides a dynamic data linkage mechanism called slot-connection that allows the user to develop 3D graphics applications by combining existing boxes through direct manipulations on a computer screen. Using IntelligentBox, the 3D multimedia presentation tool proposed in this paper was likewise developed as combined components, only through direct manipulations on a computer screen. The authors have already proposed a 3D multimedia presentation tool using a stage metaphor and its voice input interface. This time, we extended the system to make it accept user gesture input besides voice commands. This paper explains the details of the proposed 3D multimedia presentation tool and especially describes its component-based voice and gesture input interfaces.
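
    Below is an illustrative Python sketch of the slot-connection idea: one box's slot value feeds another's, so applications are composed by wiring components rather than by programming. Class and slot names are invented for this example and are not the IntelligentBox API.

      class Box:
          def __init__(self, name):
              self.name = name
              self.slots = {}
              self._connections = []  # (my slot, target box, target slot)

          def connect(self, my_slot, target, target_slot):
              self._connections.append((my_slot, target, target_slot))

          def set_slot(self, slot, value):
              self.slots[slot] = value
              # Propagate the new value along every connection on this slot.
              for s, target, ts in self._connections:
                  if s == slot:
                      target.set_slot(ts, value)

      voice_input = Box("voice")
      camera = Box("camera")
      voice_input.connect("command", camera, "action")
      voice_input.set_slot("command", "zoom_in")
      print(camera.slots)  # {'action': 'zoom_in'}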

  10. Electronic Assessment and Feedback Tool in Supervision of Nursing Students during Clinical Training

    ERIC Educational Resources Information Center

    Mettiäinen, Sari

    2015-01-01

    The aim of this study was to determine nursing teachers' and students' attitudes to and experiences of using an electronic assessment and feedback tool in supervision of clinical training. The tool was called eTaitava, and it was developed in Finland. During the pilot project, the software was used by 12 nursing teachers and 430 nursing students.…

  11. Instrumentalism, Ideals and Imaginaries: Theorising the Contested Space of Global Citizenship Education in Schools

    ERIC Educational Resources Information Center

    Marshall, Harriet

    2011-01-01

    This paper exposes the tensions between different agendas and calls for what is loosely called "global citizenship education" by developing a set of sociological conceptual tools useful for engaging with associated educational forms and ideals. It presents the instrumentalist and normative agendas at play within global citizenship education…

  12. Expanding the KATE toolbox

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1993-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs which were designed and added to the collection of tools that comprise the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed; this discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.

  13. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
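
    As a hedged sketch of the kind of query the NYM tool answers, the Python snippet below picks high-yield watersheds for a chosen nutrient inside an area of interest from SPARROW-style yield records. The records, units, and threshold are invented for illustration; the real tool serves such queries through its web mapping stack.

      yields = [  # (watershed id, nutrient, source, yield in kg/km2, basin)
          ("W01", "nitrogen",   "agriculture", 950.0, "Susquehanna"),
          ("W02", "nitrogen",   "urban",       410.0, "Potomac"),
          ("W03", "phosphorus", "agriculture",  55.0, "Susquehanna"),
          ("W04", "nitrogen",   "agriculture", 780.0, "Susquehanna"),
      ]

      def high_yield(records, nutrient, basin, threshold):
          return [r for r in records
                  if r[1] == nutrient and r[4] == basin and r[3] >= threshold]

      for w in high_yield(yields, "nitrogen", "Susquehanna", 700.0):
          print(f"{w[0]}: {w[3]} kg/km2 from {w[2]}")  # candidate restoration targets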

  14. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    PubMed

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS.
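
    The Python fragment below is an illustrative sketch of the keyword-spotting step: scan the transcribed caller side of a dialogue for items of interest and fill slots in an abstract "score" of the call. The keywords and slot names are invented for this example, not taken from SACA.

      import re

      SPOT_LIST = {
          "diagnosis": ["breast cancer", "lymphoma", "melanoma"],
          "caller_role": ["patient", "relative", "friend"],
          "topic": ["chemotherapy", "radiation", "second opinion"],
      }

      def spot(utterance, score):
          text = utterance.lower()
          for slot, keywords in SPOT_LIST.items():
              for kw in keywords:
                  if re.search(r"\b" + re.escape(kw) + r"\b", text):
                      score.setdefault(slot, []).append(kw)
          return score

      score = {}
      spot("My mother was diagnosed with breast cancer last week.", score)
      spot("I'm calling as a relative; we want a second opinion.", score)
      print(score)  # {'diagnosis': ['breast cancer'], 'caller_role': ['relative'], 'topic': ['second opinion']}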

  15. GuidosToolbox: universal digital image object analysis

    Treesearch

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, userfriendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  16. Assessing the Use of Instant Messaging in Online Learning Environments

    ERIC Educational Resources Information Center

    Contreras-Castillo, Juan; Perez-Fragoso, Carmen; Favela, Jesus

    2006-01-01

    There is a body of evidence supporting the claim that informal interactions in educational environments have positive effects on learning. In order to increase the opportunities of informal interaction in online courses, an instant messaging tool, called CENTERS, was developed and integrated into online learning environments. This tool provides…

  17. Psychological Factors Associated with Paranursing Expertise.

    ERIC Educational Resources Information Center

    Brammer, Robert; Haller, Katherine

    The psychological factors associated with paranursing expertise were examined in a study of 135 certified nursing assistants (CNAs) at a geriatric nursing facility in Amarillo, Texas. Data were collected through a project-developed screening tool called the Geriatric Employee Screening Tool (GEST), which is a true-false instrument patterned after…

  18. BS-virus-finder: virus integration calling using bisulfite sequencing data.

    PubMed

    Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S

    2018-01-01

    DNA methylation plays a key role in the regulation of gene expression and carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and to reveal the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci by using bisulfite sequencing data.

  19. Luigi Gentile Polese | NREL

    Science.gov Websites

    software development of next-generation whole-building energy modeling, analysis, and simulation tools ... technical positions in networking protocol specifications, call control software, and requirements

  20. Organisational capacity and its relationship to research use in six Australian health policy agencies

    PubMed Central

    Makkar, Steve R.; Haynes, Abby; Williamson, Anna; Redman, Sally

    2018-01-01

    There are calls for policymakers to make greater use of research when formulating policies. Therefore, it is important that policy organisations have a range of tools and systems to support their staff in using research in their work. The aim of the present study was to measure the extent to which a range of tools and systems to support research use were available within six Australian agencies with a role in health policy, and examine whether this was related to the extent of engagement with, and use of research in policymaking by their staff. The presence of relevant systems and tools was assessed via a structured interview called ORACLe which is conducted with a senior executive from the agency. To measure research use, four policymakers from each agency undertook a structured interview called SAGE, which assesses and scores the extent to which policymakers engaged with (i.e., searched for, appraised, and generated) research, and used research in the development of a specific policy document. The results showed that all agencies had at least a moderate range of tools and systems in place, in particular policy development processes; resources to access and use research (such as journals, databases, libraries, and access to research experts); processes to generate new research; and mechanisms to establish relationships with researchers. Agencies were less likely, however, to provide research training for staff and leaders, or to have evidence-based processes for evaluating existing policies. For the majority of agencies, the availability of tools and systems was related to the extent to which policymakers engaged with, and used research when developing policy documents. However, some agencies did not display this relationship, suggesting that other factors, namely the organisation’s culture towards research use, must also be considered. PMID:29513669

  1. Organisational capacity and its relationship to research use in six Australian health policy agencies.

    PubMed

    Makkar, Steve R; Haynes, Abby; Williamson, Anna; Redman, Sally

    2018-01-01

    There are calls for policymakers to make greater use of research when formulating policies. Therefore, it is important that policy organisations have a range of tools and systems to support their staff in using research in their work. The aim of the present study was to measure the extent to which a range of tools and systems to support research use were available within six Australian agencies with a role in health policy, and examine whether this was related to the extent of engagement with, and use of research in policymaking by their staff. The presence of relevant systems and tools was assessed via a structured interview called ORACLe which is conducted with a senior executive from the agency. To measure research use, four policymakers from each agency undertook a structured interview called SAGE, which assesses and scores the extent to which policymakers engaged with (i.e., searched for, appraised, and generated) research, and used research in the development of a specific policy document. The results showed that all agencies had at least a moderate range of tools and systems in place, in particular policy development processes; resources to access and use research (such as journals, databases, libraries, and access to research experts); processes to generate new research; and mechanisms to establish relationships with researchers. Agencies were less likely, however, to provide research training for staff and leaders, or to have evidence-based processes for evaluating existing policies. For the majority of agencies, the availability of tools and systems was related to the extent to which policymakers engaged with, and used research when developing policy documents. However, some agencies did not display this relationship, suggesting that other factors, namely the organisation's culture towards research use, must also be considered.

  2. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  3. Accuracy of a Screening Tool for Early Identification of Language Impairment

    ERIC Educational Resources Information Center

    Uilenburg, Noëlle; Wiefferink, Karin; Verkerk, Paul; van Denderen, Margot; van Schie, Carla; Oudesluys-Murphy, Ann-Marie

    2018-01-01

    Purpose: A screening tool called the "VTO Language Screening Instrument" (VTO-LSI) was developed to enable more uniform and earlier detection of language impairment. This report, consisting of 2 retrospective studies, focuses on the effects of using the VTO-LSI compared to regular detection procedures. Method: Study 1 retrospectively…

  4. Using Appreciative Inquiry to Facilitate Implementation of the Recovery Model in Mental Health Agencies

    ERIC Educational Resources Information Center

    Clossey, Laurene; Mehnert, Kevin; Silva, Sara

    2011-01-01

    This article describes an organizational development tool called appreciative inquiry (AI) and its use in mental health to aid agencies implementing recovery model services. AI is a discursive tool with the power to shift dominant organizational cultures. Its philosophical underpinnings emphasize values consistent with recovery: community,…

  5. SUSTAIN - A BMP Process and Placement Tool for Urban Watersheds (Poster)

    EPA Science Inventory

    To assist stormwater management professionals in planning for best management practices (BMPs) and low-impact developments (LIDs) implementation, USEPA is developing a decision support system, called the System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN). ...

  6. Seismic hazard assessment of Oregon highway truck routes.

    DOT National Transportation Integrated Search

    2012-06-01

    This research project developed a seismic risk assessment model along the major truck routes in Oregon. The study adopted federally developed software tools called Risk for Earthquake Damage to Roadway Systems (REDARS2) and HAZUS-MH. The model ...

  7. Flight Awareness Collaboration Tool Development

    NASA Technical Reports Server (NTRS)

    Mogford, Richard

    2016-01-01

    This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.

  8. Java PathExplorer: A Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.
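
    As a toy sketch of trace checking in this style, the Python function below tests a finite execution trace against one temporal property ("every request is eventually followed by a grant"). The property and event names are invented for illustration; Java PathExplorer's actual specification logics are richer and operate on instrumented Java programs.

      def eventually_granted(trace):
          pending = 0
          for event in trace:
              if event == "request":
                  pending += 1
              elif event == "grant" and pending > 0:
                  pending -= 1
          return pending == 0  # True iff no request was left unanswered

      print(eventually_granted(["request", "work", "grant", "request", "grant"]))  # True
      print(eventually_granted(["request", "work", "request", "grant"]))           # False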

  9. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  10. Designing and Implementing an Online GIS Tool for Schools: The Finnish Case of the PaikkaOppi Project

    ERIC Educational Resources Information Center

    Riihelä, Juha; Mäki, Sanna

    2015-01-01

    This article describes initiatives implemented in Finland to create an online learning environment for studying geographic information systems (GIS). A development project produced an online GIS tool called PaikkaOppi, aimed at promoting GIS studies and spatial thinking skills in upper secondary schools. The project is reviewed through analysis of…

  11. Strengthening Ecological Mindfulness through Hybrid Learning in Vital Coalitions

    ERIC Educational Resources Information Center

    Sol, Jifke; Wals, Arjen E. J.

    2015-01-01

    In this contribution a key policy "tool" used in the Dutch Environmental Education and Learning for Sustainability Policy framework is introduced as a means to develop a sense of place and associated ecological mindfulness. The key elements of this tool, called the vital coalition, are described while an example of its use in practice,…

  12. Web-Based Course Delivery and Administration Using Scheme.

    ERIC Educational Resources Information Center

    Salustri, Filippo A.

    This paper discusses the use at the University of Windsor (Ontario) of a small World Wide Web-based tool for course delivery and administration called HAL (HTML-based Administrative Lackey), written in the Scheme programming language. This tool was developed by the author to provide Web-based services for a large first-year undergraduate course in…

  13. Using Empathic Identification as a Literacy Tool for Building Culturally Responsive Pedagogy with Preservice Teachers

    ERIC Educational Resources Information Center

    Gunn, AnnMarie Alberton; King, James R.

    2015-01-01

    This study explores how teaching cases that featured diversity and literacy issues and self-reflexive writing exercises called postcard narratives can be used as tools by teacher educators for developing a culturally responsive pedagogy with preservice teachers. This study used interviews with the professor, a journal kept by the professor, a…

  14. DDP - a tool for life-cycle risk management

    NASA Technical Reports Server (NTRS)

    Cornford, S. L.; Feather, M. S.; Hicks, K. A.

    2001-01-01

    At JPL we have developed, and implemented, a process for achieving life-cycle risk management. This process has been embodied in a software tool and is called Defect Detection and Prevention (DDP). The DDP process can be succinctly stated as: determine where we want to be, what could get in the way and how we will get there.

  15. Pecha Kucha Style Powerpoint Presentation: An Innovative Call Approach to Developing Oral Presentation Skills of Tertiary Students

    ERIC Educational Resources Information Center

    Murugaiah, Puvaneswary

    2016-01-01

    In computer-assisted language learning (CALL), technological tools are often used both as an end and as a means to an end (Levy & Stockwell, 2006). Microsoft PowerPoint is an example of the latter as it is commonly used in oral presentations in classrooms. However, many student presentations are often boring as students generally read from…

  16. Software Tool Issues

    NASA Astrophysics Data System (ADS)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  17. Teaching the Teacher: Tutoring SimStudent Leads to More Effective Cognitive Tutor Authoring

    ERIC Educational Resources Information Center

    Matsuda, Noboru; Cohen, William W.; Koedinger, Kenneth R.

    2015-01-01

    SimStudent is a machine-learning agent initially developed to help novice authors to create cognitive tutors without heavy programming. Integrated into an existing suite of software tools called Cognitive Tutor Authoring Tools (CTAT), SimStudent helps authors to create an expert model for a cognitive tutor by tutoring SimStudent on how to solve…

  18. Vision Algorithms Catch Defects in Screen Displays

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Andrew Watson, a senior scientist at Ames Research Center, developed a tool called the Spatial Standard Observer (SSO), which models human vision for use in robotic applications. Redmond, Washington-based Radiant Zemax LLC licensed the technology from NASA and combined it with its imaging colorimeter system, creating a powerful tool that high-volume manufacturers of flat-panel displays use to catch defects in screens.

  19. Multivariate Density Estimation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1983-01-01

    Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is called a thunderstorm data analysis.
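
    For a concrete taste of the nonparametric estimates involved, here is a small Python sketch that evaluates a Gaussian product-kernel density estimate for bivariate data at a query point. The data and bandwidths are illustrative assumptions; the paper's specific estimators and graphical tooling are not reproduced.

      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(loc=[0.0, 2.0], scale=[1.0, 0.5], size=(500, 2))  # sample points

      def kde(point, data, h=(0.3, 0.3)):
          # Average of Gaussian product-kernel weights centered at each sample.
          z = (point - data) / h
          weights = np.exp(-0.5 * (z ** 2).sum(axis=1))
          norm = len(data) * 2.0 * np.pi * h[0] * h[1]
          return weights.sum() / norm

      print(f"estimated density near the mode: {kde(np.array([0.0, 2.0]), data):.3f}")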

  20. Vocabulary Notebook: A Digital Solution to General and Specific Vocabulary Learning Problems in a CLIL Context

    ERIC Educational Resources Information Center

    Bazo, Plácido; Rodríguez, Romén; Fumero, Dácil

    2016-01-01

    In this paper, we will introduce an innovative software platform that can be especially useful in a Content and Language Integrated Learning (CLIL) context. This tool is called Vocabulary Notebook, and has been developed to solve all the problems that traditional (paper) vocabulary notebooks have. This tool keeps focus on the personalisation of…

  1. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  2. The HexSim Model

    EPA Science Inventory

    HexSim version 2.0 is soon to be released by EPA's Western Ecology Division (WED). More than three years of work have gone into the development of this tool, which grew out of an EPA model called PATCH. HexSim makes it possible for non-programmers to develop sophisticated simula...

  3. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software generation systems have been offered to address problems of development productivity and resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is claimed that CASE tools enable a significant level of software development automation. In practice, today's CASE tools offer a combination of features: a model editor and a model repository in traditional tools, and, in the most advanced ones, a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool that produces new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of comparing CASE tools (mainly UML editors) against the level of automation they offer.

  4. Webpress: An Internet Outreach from NASA Dryden

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel J.

    1996-01-01

    The Technology and Commercialization Office at NASA Dryden has developed many educational outreach programs for K-12 educators. This project concentrates on the internet portion of that effort, specifically focusing on the development of an internet tool for educators called Webpress. This tool will not only provide user-friendly access to aeronautical topics and interesting individuals on the world wide web (web), but will also enable teachers to rapidly submit and display their own materials and links for use in the classroom.

  5. APT, The Phase I tool for HST Cycle 12

    NASA Astrophysics Data System (ADS)

    Blacker, Brett S.; Bertch, Maria; Curtis, Gary; Douglas, Robert E., Jr.; Krueger, Anthony P.

    2002-12-01

    In the continuing effort to streamline our systems and improve service to the science community, the Space Telescope Science Institute (STScI) is developing and releasing APT, the Astronomer's Proposal Tool, as the new interface for Hubble Space Telescope (HST) Phase I and Phase II proposal submissions for HST Cycle 12. APT was formerly called the Scientist's Expert Assistant (SEA), which started as a prototype effort to bring state-of-the-art technology, more visual tools, and power into the hands of proposers so that they can optimize the scientific return of their programs as well as of HST. Proposing for HST and other missions consists of requesting observing time and/or archival research funding. This step is called Phase I, where the scientific merit of a proposal is considered by a community-based peer-review process. Accepted proposals then proceed through Phase II, where the observations are specified in sufficient detail to enable scheduling on the telescope. In this paper, we present our concept and implementation plans for our Phase I development and submission tool, APT. More importantly, we go behind the scenes and discuss why it is important for the Science Policies Division (SPD) and other groups at STScI to have a new submission tool and new submission output products. This paper is an update on the status of the HST Phase I Proposal Processing System described in the published paper "A New Era for HST Phase I Development and Submission."

  6. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  7. 77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...

  8. Monte Carlo-based searching as a tool to study carbohydrate structure

    USDA-ARS?s Scientific Manuscript database

    A torsion angle-based Monte-Carlo searching routine was developed and applied to several carbohydrate modeling problems. The routine was developed as a Unix shell script that calls several programs, which allows it to be interfaced with multiple potential functions and various functions for evaluat...
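
    For readers unfamiliar with torsion-angle Monte-Carlo searching, the sketch below shows the generic Metropolis scheme such a routine implements. It is a hedged illustration, not the manuscript's shell script: `energy` stands in for whatever external potential-function programs the actual routine calls, and the step size and temperature are assumptions.

    ```python
    import math
    import random

    def metropolis_torsion_search(energy, n_angles, steps=10000,
                                  step_deg=15.0, kT=1.0):
        """Metropolis Monte-Carlo search over a vector of torsion angles (deg).

        `energy` is a caller-supplied function of the angle vector; in the
        described routine this role is played by external potential programs.
        """
        angles = [random.uniform(-180.0, 180.0) for _ in range(n_angles)]
        e = energy(angles)
        best, best_e = list(angles), e
        for _ in range(steps):
            trial = list(angles)
            i = random.randrange(n_angles)
            # Perturb one torsion and wrap it back into [-180, 180)
            trial[i] = ((trial[i] + random.uniform(-step_deg, step_deg)
                         + 180.0) % 360.0) - 180.0
            e_trial = energy(trial)
            # Accept downhill moves always, uphill moves with Boltzmann probability
            if e_trial <= e or random.random() < math.exp(-(e_trial - e) / kT):
                angles, e = trial, e_trial
                if e < best_e:
                    best, best_e = list(angles), e
        return best, best_e
    ```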

  9. Developing the E-Delphi System: A Web-Based Forecasting Tool for Educational Research.

    ERIC Educational Resources Information Center

    Chou, Chien

    2002-01-01

    Discusses use of the Delphi technique and describes the development of an electronic version, called e-Delphi, in which questionnaire construction and communication with panel members was accomplished using the Web. Explains system function and interface and discusses evaluation of the e-Delphi system. (Author/LRW)

  10. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: a prototype of the wrapper concepts for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, was (initially) targeted for

  11. The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers

    ERIC Educational Resources Information Center

    Bird, Len

    2007-01-01

    This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…

  12. [EEQ in clinical embryology: a starting program].

    PubMed

    Boyer, Pierre; Brugnon, Florence; Levy, Rachel; Pfeffer, Jérôme; Siest, Jean-Pascal

    2014-01-01

    Every laboratory, including those working in assisted reproductive technologies, has to be accredited to EN ISO 15189 before 2020. This standardisation includes an external quality evaluation (EQE). In order to work out an EQE tool, we used images extracted from our own database developed during daily practice. We built an easy-to-use online tool called "EEQ en embryologie clinique", developed on the Biologie Prospective website with the expertise of the French ART biologists' association (Blefco) in the evaluation of early human embryonic stages. In 2013, 38 ART laboratories participated in the first program, with more than 90% appropriate results. The present article aims at describing this tool and discussing its limits.

  13. Development of a knowledge management system for complex domains.

    PubMed

    Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg

    2012-01-01

    Deutsche Flugsicherung GmbH (DFS), the German Air Navigation Service Provider, follows a systematic approach, called HERA, for investigating incidents. The HERA analysis shows a distinctive occurrence of incidents in German air traffic control in which the visual perception of information plays a key role. The reasons can be partially traced back to workstation design, where basic ergonomic rules and principles are not sufficiently followed by the designers in some cases. In cooperation with the Institute of Ergonomics in Darmstadt, the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools was found to meet the identified user requirements holistically. It was therefore suggested to develop an enhanced software tool called the Design Process Guide. The name Design Process Guide indicates that this tool exceeds the classic functions of currently available Knowledge Management Systems. It offers "design element"-based access, covers process- and content-related topics, and shows the implications of certain design decisions. Furthermore, it serves as documentation, detailing why a designer made a decision under a particular set of conditions.

  14. THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    A toolkit for distributed hydrologic modeling at multiple scales using a geographic information system is presented. This open-source, freely available software was developed through a collaborative endeavor involving two Universities and two government agencies. Called the Auto...

  15. The Early Development Instrument: Translating School Readiness Assessment into Community Actions and Policy Planning

    ERIC Educational Resources Information Center

    Guhn, Martin; Janus, Magdalena; Hertzman, Clyde

    2007-01-01

    This invited special issue of "Early Education and Development" presents research related to the Early Development Instrument (EDI; Janus & Offord, 2007), a community tool to assess children's school readiness at a population level. In this editorial introduction, we first sketch out recent trends in school readiness research that call for a…

  16. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  17. Performance Support Tools: Delivering Value when and where It Is Needed

    ERIC Educational Resources Information Center

    McManus, Paul; Rossett, Allison

    2006-01-01

    Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…

  18. Site specific passive acoustic detection and densities of humpback whale calls off the coast of California

    NASA Astrophysics Data System (ADS)

    Helble, Tyler Adam

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. Automated methods are needed to aid in the analyses of the recorded data. When a mammal vocalizes in the marine environment, the received signal is a filtered version of the original waveform emitted by the marine mammal. The waveform is reduced in amplitude and distorted due to propagation effects that are influenced by the bathymetry and environment. It is important to account for these effects to determine a site-specific probability of detection for marine mammal calls in a given study area. A knowledge of that probability function over a range of environmental and ocean noise conditions allows vocalization statistics from recordings of single, fixed, omnidirectional sensors to be compared across sensors and at the same sensor over time with less bias and uncertainty in the results than direct comparison of the raw statistics. This dissertation focuses both on the development of new tools needed to automatically detect humpback whale vocalizations from single, fixed omnidirectional sensors and on the determination of the site-specific probability of detection for monitoring sites off the coast of California. Using these tools, detected humpback calls are "calibrated" for environmental properties using the site-specific probability of detection values, and presented as call densities (calls per square kilometer per unit time). A two-year monitoring effort using these calibrated call densities reveals important biological and ecological information on migrating humpback whales off the coast of California. Call density trends are compared between the monitoring sites and at the same monitoring site over time. Call densities also are compared to several natural and human-influenced variables including season, time of day, lunar illumination, and ocean noise. The results reveal substantial differences in call densities between the two sites which were not noticeable using uncorrected (raw) call counts. Additionally, a Lombard effect was observed for humpback whale vocalizations in response to increasing ocean noise. The results presented in this thesis develop techniques to accurately measure marine mammal abundances from passive acoustic sensors.
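
    At its simplest, the calibration step described above reduces to dividing raw detection counts by the site-specific probability of detection and by the monitored area and time. A minimal sketch follows; the function and argument names are illustrative, not taken from the dissertation.

    ```python
    def call_density(n_detected, p_detect, area_km2, duration_hr):
        """Detection-corrected call density (calls per km^2 per hour).

        p_detect is the site-specific probability of detection estimated for
        the prevailing environment and ocean-noise conditions; all names here
        are illustrative placeholders.
        """
        if not 0.0 < p_detect <= 1.0:
            raise ValueError("p_detect must lie in (0, 1]")
        return n_detected / (p_detect * area_km2 * duration_hr)

    # e.g. 420 detected calls, P(detect)=0.35, 1200 km^2 monitored for 24 h
    print(call_density(420, 0.35, 1200.0, 24.0))
    ```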

  19. Tool for Movable Ceiling

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Bendix Corp., with the help of NASA's Kennedy Space Center, developed a tool to equalize tensions in the 150 cables of the ceiling. This inexpensive tool used in concert halls was developed first for elevator and crane cables used to lift heavy space vehicles. University of Akron's performing arts hall has been developed to shrink and expand to accommodate audiences as large as 3,000 and as small as 900. Once the hall has been sound tuned, various positions of this ingenious ceiling and related acoustic curtains may be called into play immediately by pushing buttons on a control console programmed previously. With the touch of a finger before an event, a technician may condition the hall for chamber music, symphony, or theater.

  20. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and e-learning curricula to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU); it is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European, and Australian institutes, focuses especially on developing e-learning curricula and interactive design and modeling tools, as well as on the development of a virtual laboratory. Snapshots from these two projects are presented.

  1. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  2. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
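
    As a rough illustration of the first workflow step, lane segmentation, the sketch below finds lane boundaries from a column-wise intensity profile of a grayscale gel image. This is a simplified stand-in under stated assumptions (dark bands on a light background, an arbitrary threshold fraction); it omits the distortion correction that GELect itself performs.

    ```python
    import numpy as np

    def segment_lanes(gel, min_width=5):
        """Split a grayscale gel image (2-D array, dark bands on light
        background) into lanes using the column-wise darkness profile.

        Returns a list of (x_start, x_end) column ranges. A simplified
        stand-in for the first stage of a gel-typing workflow; real images
        need the distortion handling this sketch omits.
        """
        # Column profile: total darkness per column (invert so bands add signal)
        profile = (gel.max() - gel).sum(axis=0)
        threshold = 0.3 * profile.max()
        in_lane, lanes, start = False, [], 0
        for x, v in enumerate(profile):
            if v >= threshold and not in_lane:
                in_lane, start = True, x
            elif v < threshold and in_lane:
                in_lane = False
                if x - start >= min_width:
                    lanes.append((start, x))
        if in_lane:
            lanes.append((start, len(profile)))
        return lanes
    ```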

  3. Deriving stellar parameters with the SME software package

    NASA Astrophysics Data System (ADS)

    Piskunov, N.

    2017-09-01

    Photometry and spectroscopy are complementary tools for deriving accurate stellar parameters. Here I present one of the popular packages for stellar spectroscopy called SME with the emphasis on the latest developments and error assessment for the derived parameters.

  4. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in an increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is explained. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom program and the U.S. Air Force.

  5. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1991-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is described. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  6. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and provide versions of the software in Macintosh and Windows compatible formats. Appendix 1, the Science Requirements Document (SRD) User's Manual, is attached.

  7. On-line resources for bacterial micro-evolution studies using MLVA or CRISPR typing.

    PubMed

    Grissa, Ibtissem; Bouchon, Patrick; Pourcel, Christine; Vergnaud, Gilles

    2008-04-01

    The control of bacterial pathogens requires the development of tools allowing the precise identification of strains at the subspecies level. It is now widely accepted that these tools will need to be DNA-based assays (in contrast to identification at the species level, where biochemical assays are still widely used, even though very powerful 16S DNA sequence databases exist). Typing assays need to be cheap and amenable to the design of international databases. The success of such subspecies typing tools will eventually be measured by the size of the associated reference databases accessible over the internet. Three methods have shown some potential in this direction: the so-called spoligotyping assay (Mycobacterium tuberculosis, 40,000-entry database), Multiple Loci Sequence Typing (MLST; up to a few thousand entries for each of more than 20 bacterial species), and more recently Multiple Loci VNTR Analysis (MLVA; up to a few hundred entries, assays available for more than 20 pathogens). In the present report we review the current status of the tools and resources we have developed over the past seven years to help in the setting-up or use of MLVA assays, and lately for analysing Clustered Regularly Interspaced Short Palindromic Repeats (CRISPRs), which are the basis for spoligotyping assays.

  8. Career Counselling Development: A Case Study of an Innovative Career Counselling Tool

    ERIC Educational Resources Information Center

    Papakota, Aikaterini

    2016-01-01

    Promoting the use of new technologies in the career counselling process, the Career Services Office of the Aristotle University of Thessaloniki has developed an easy-to-use career counselling guide containing multimedia applications. The purpose of this career guide, called "Career Counseling@Career Office of Aristotle University of…

  9. Learning Mathematics by Designing, Programming, and Investigating with Interactive, Dynamic Computer-Based Objects

    ERIC Educational Resources Information Center

    Marshall, Neil; Buteau, Chantal

    2014-01-01

    As part of their undergraduate mathematics curriculum, students at Brock University learn to create and use computer-based tools with dynamic, visual interfaces, called Exploratory Objects, developed for the purpose of conducting pure or applied mathematical investigations. A student's Development Process Model of creating and using an Exploratory…

  10. Competency-Based Education in Three Pilot Programs: What It Is, How It's Implemented, and How It's Working. Brief

    ERIC Educational Resources Information Center

    Steele, Jennifer L.; Lewis, Matthew W.; Santibanez, Lucrecia; Faxon-Mills, Susannah; Rudnick, Mollie; Stecher, Brian M.; Hamilton, Laura S.

    2014-01-01

    In 2011, the Bill & Melinda Gates Foundation extended grants to three educational organizations working to develop or enhance competency-based approaches in large, urbanized school systems. The grant initiative, called Project Mastery, funded the development of technology-enhanced tools, including curriculum materials and online learning…

  11. The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures

    ERIC Educational Resources Information Center

    O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.

    2007-01-01

    The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…

  12. The Effects of Computer Assisted Mediating Prompts on EFL Learners' Writing Ability

    ERIC Educational Resources Information Center

    Damavandi, Zahra Mousazadeh; Hassaskhah, Jaleh; Zafarghandi, Amir Mahdavi

    2018-01-01

    This study aims to examine the EFL learners' perception and process of writing development through using a digital storytelling tool, called "Storyjumper." To do so, 15 intermediate-level students were participants of the study. The participants' writing development was frequently assessed through a series of repeated writing tests…

  13. Integration of e-Management, e-Development and e-Learning Technologies for Blended Course Delivery

    ERIC Educational Resources Information Center

    Johnson, Lynn E.; Tang, Michael

    2005-01-01

    This paper describes and assesses a pre-engineering curriculum development project called Foundations of Engineering, Science and Technology (FEST). FEST integrates web-based technologies into an inter-connected system to enable delivery of a blended program at multiple institutions. Tools and systems described include 1) technologies to deliver…

  14. Teachers as Learners: What Makes Technology-Focused Professional Development Effective?

    ERIC Educational Resources Information Center

    Curwood, Jen Scott

    2011-01-01

    Prompted by calls for research on technology-focused professional development, this article investigates how learning communities influence secondary English teachers' use of digital tools. Findings from this year-long study in the United States indicate that the ways in which technology is integrated within the English curriculum are still very…

  15. Teaching for Transformative Educational Experience in a Sport for Development Program

    ERIC Educational Resources Information Center

    Wright, Paul M.; Jacobs, Jenn M.; Ressler, James D.; Jung, Jinhong

    2016-01-01

    Despite the assumption that Sport for Development and Peace programs can foster social change, many fail to provide intentional educational experiences. This limits the attainment and sustainability of positive outcomes for participants and communities. The literature calls for such programs to use sport as an educational tool that shifts power to…

  16. Operator function modeling: An approach to cognitive task analysis in supervisory control systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1987-01-01

    In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).

  17. Development of a Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, G.; Kauffman, W. (Technical Monitor)

    2002-01-01

    This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.

  18. Company profile: PGXIS Ltd.

    PubMed

    McCarthy, Alun

    2011-09-01

    Pharmacogenomic Innovative Solutions Ltd (PGXIS) was established in 2007 by a group of pharmacogenomic (PGx) experts to make their expertise available to biotechnology and pharmaceutical companies. PGXIS has subsequently established a network of experts to broaden its access to relevant PGx knowledge and technologies. In addition, it has developed a novel multivariate analysis method called Taxonomy3 which is both a data integration tool and a targeting tool. Together with siRNA methodology from CytoPathfinder Inc., PGXIS now has an extensive range of diverse PGx methodologies focused on enhancing drug development.

  19. Development of an After-Sales Support Inter-Enterprise Collaboration System Using Information Technologies

    NASA Astrophysics Data System (ADS)

    Kimura, Toshiaki; Kasai, Fumio; Kamio, Yoichi; Kanda, Yuichi

    This research paper discusses a manufacturing support system which supports not only maintenance services but also consulting services for manufacturing systems consisting of multi-vendor machine tools. To achieve this, the system enables inter-enterprise collaboration between engineering companies and machine tool vendors. The system is called the "After-Sales Support Inter-enterprise collaboration System using information Technologies" (ASSIST). This paper describes the concept behind the planned ASSIST and the development of a prototype of the system, and discusses test operation results.

  20. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD.

    PubMed

    Slok, Annerika H M; in 't Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, P N Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-07-10

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice.

  1. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD

    PubMed Central

    Slok, Annerika H M; in ’t Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, PN Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice. PMID:25010353

  2. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  3. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  4. Economic importance of transportation services : highlights of the Transportation Satellite Accounts

    DOT National Transportation Integrated Search

    1998-04-01

    A new accounting tool, called the Transportation Satellite Accounts (TSA), now provides a way to measure both in-house and for-hire transportation services. The TSA, developed jointly by the Bureau of Transportation Statistics (BTS) of the Department...

  5. Using Forensics to Untangle Batch Effects in TCGA Data - TCGA

    Cancer.gov

    Rehan Akbani, Ph.D., and colleagues at the University of Texas MD Anderson Cancer Center developed a tool called MBatch to detect, diagnose, and correct batch effects in TCGA data. Read more about batch effects in this Case Study.

  6. A new approach for developing adjoint models

    NASA Astrophysics Data System (ADS)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and supplies callbacks to compute the action of these operators. The library, called libadjoint, is then capable of symbolically manipulating the forward annotation to automatically assemble the adjoint equations. Libadjoint is open source, and is explicitly designed to be bolted-on to an existing discrete model. It can be applied to any discretisation, steady or time-dependent problems, and both linear and nonlinear systems. Using libadjoint has several advantages. It requires the application of an AD tool only to small pieces of code, making the use of AD far more tractable. As libadjoint derives the adjoint equations, the expertise required to develop an adjoint model is greatly diminished. One major advantage of this approach is that the model developer is freed from implementing complex checkpointing strategies for the adjoint model: libadjoint has sufficient information about the forward model to re-play the entire forward solve when necessary, and thus the checkpointing algorithm can be implemented entirely within the library itself. Examples are shown using the Fluidity/ICOM framework, a complex ocean model under development at Imperial College London.
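
    The "model as a sequence of linear solves" abstraction can be illustrated with a toy tape: each forward solve A_i x_i = B_i x_{i-1} is recorded, and the adjoint is assembled by replaying the transposed operators in reverse. The sketch below conveys the idea only; libadjoint's actual annotation API and callback mechanism differ, and the class and method names are invented for illustration.

    ```python
    import numpy as np

    class Tape:
        """Toy record of a forward model viewed as a sequence of linear
        solves A_i x_i = B_i x_{i-1}; replaying it transposed in reverse
        yields the adjoint. Purely illustrative, not libadjoint's API.
        """
        def __init__(self):
            self.steps = []  # (A_i, B_i) recorded per forward solve

        def solve(self, A, B, x_prev):
            self.steps.append((A, B))
            return np.linalg.solve(A, B @ x_prev)

        def adjoint_gradient(self, dJ_dxN):
            """Given dJ/dx_N, replay the tape in reverse to get dJ/dx_0."""
            lam = dJ_dxN
            for A, B in reversed(self.steps):
                lam_i = np.linalg.solve(A.T, lam)   # A_i^T lambda_i = rhs
                lam = B.T @ lam_i                   # propagate to step i-1
            return lam

    rng = np.random.default_rng(0)
    tape, x = Tape(), rng.standard_normal(3)
    for _ in range(4):                               # four forward solves
        x = tape.solve(rng.standard_normal((3, 3)) + 4 * np.eye(3),
                       rng.standard_normal((3, 3)), x)
    g = np.ones(3)                                   # J(x_N) = g . x_N
    grad = tape.adjoint_gradient(g)                  # dJ/dx_0 via reverse replay
    print(grad)
    ```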

  7. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  8. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
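
    To give a flavor of what a scintillation-camera readout simulator computes, the toy sketch below emits optical photons isotropically from an interaction point and bins those reaching the exit face onto a sensor grid. Unlike SCOUT, it ignores reflections, absorption, and sensor noise; all names and dimensions are illustrative assumptions.

    ```python
    import numpy as np

    def readout(event_xy, depth, n_photons, n_pix=8, crystal=50.0, rng=None):
        """Toy Monte-Carlo readout: emit optical photons isotropically from a
        scintillation point and bin those reaching the exit face onto an
        n_pix x n_pix sensor grid (crystal size in mm). Reflections and
        absorption, which a real camera simulator must model, are ignored.
        """
        if rng is None:
            rng = np.random.default_rng()
        # Isotropic directions; keep the downward-going half (cos(theta) < 0)
        cos_t = rng.uniform(-1.0, 0.0, n_photons)
        phi = rng.uniform(0.0, 2 * np.pi, n_photons)
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        t = depth / -cos_t                      # path length to the exit plane
        x = event_xy[0] + t * sin_t * np.cos(phi)
        y = event_xy[1] + t * sin_t * np.sin(phi)
        inside = (np.abs(x) < crystal / 2) & (np.abs(y) < crystal / 2)
        img, _, _ = np.histogram2d(x[inside], y[inside], bins=n_pix,
                                   range=[[-crystal / 2, crystal / 2]] * 2)
        return img  # per-pixel photon counts for one event
    ```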

  9. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  10. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    NASA Astrophysics Data System (ADS)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers will rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether provided tools are adequately assisting consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. of how well tool-assisted decisions reflect consumer preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights on the utility of both support tools. Although it was shown that the cogito tool obtained slightly higher decision accuracy, both tools could benefit from additional enhancements. Details of the procedure developed and results obtained from the evaluation are provided. Opportunities for future work are also discussed.
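
    For readers unfamiliar with the rough-set machinery mentioned above, the sketch below computes lower and upper approximations of a target set under an indiscernibility relation, together with the standard accuracy-of-approximation ratio. It illustrates the general technique only, not the authors' specific evaluation procedure; the data and names are invented.

    ```python
    from collections import defaultdict

    def approximations(objects, attrs, target):
        """Rough-set lower/upper approximation of a target set of objects.

        objects: dict name -> attribute tuple; attrs: indices of condition
        attributes defining indiscernibility; target: set of object names.
        """
        blocks = defaultdict(set)                 # indiscernibility classes
        for name, row in objects.items():
            blocks[tuple(row[i] for i in attrs)].add(name)
        lower = set().union(*[b for b in blocks.values() if b <= target])
        upper = set().union(*[b for b in blocks.values() if b & target])
        return lower, upper

    def accuracy(lower, upper):
        """Accuracy of approximation: 1.0 means the set is crisply definable."""
        return len(lower) / len(upper) if upper else 1.0

    objs = {"u1": ("y", 1), "u2": ("y", 1), "u3": ("n", 0)}
    lo, up = approximations(objs, attrs=[0], target={"u1"})
    print(accuracy(lo, up))  # 0.0: u1 is indiscernible from u2 on attribute 0
    ```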

  11. Methods for examining data quality in healthcare integrated data repositories.

    PubMed

    Huser, Vojtech; Kahn, Michael G; Brown, Jeffrey S; Gouripeddi, Ramkiran

    2018-01-01

    This paper summarizes the content of a workshop focused on data quality. The first speaker (VH) described the data quality infrastructure and data quality evaluation methods currently in place within the Observational Health Data Sciences and Informatics (OHDSI) consortium. The speaker described in detail a data quality tool called Achilles Heel and the latest developments for extending this tool. Interim results of an ongoing data quality study within the OHDSI consortium were also presented. The second speaker (MK) described lessons learned and new data quality checks developed by the PEDSnet pediatric research network. The last two speakers (JB, RG) described tools developed by the Sentinel Initiative and the University of Utah's service-oriented framework. Throughout, and in a closing discussion, the workshop considered how data quality assessment can be advanced by combining the best features of each network.

  12. Bibliometric indexes, databases and impact factors in cardiology

    PubMed Central

    Bienert, Igor R C; de Oliveira, Rogério Carvalho; de Andrade, Pedro Beraldo; Caramori, Carlos Antonio

    2015-01-01

    Bibliometry is a quantitative statistical technique to measure levels of production and dissemination of knowledge, as well as a useful tool to track the development of a scientific area. The valuation of production required for recognition of researchers and journals is accomplished through tools called bibliometric indexes, divided into quality indicators and scientific-impact indicators. Initially developed as statistical measures for monographs, especially in libraries, bibliometrics is today mainly used to evaluate the productivity of authors and the repercussion of their citations. However, these tools have limitations and sometimes provoke controversy over indiscriminate application, leading to the development of newer indexes. It is important to know the most common indexes and use them properly, while acknowledging their limitations, as they have a direct impact on researchers' daily practice, reputation, and ability to secure funds. PMID:26107458

  13. Experiences with DCE: the pro7 communication server based on OSF-DCE functionality.

    PubMed

    Schulte, M; Lordieck, W

    1997-01-01

    The pro7 communication server is a new approach to managing communication between different applications on different hardware platforms in a hospital environment. The most important features are the use of OSF/DCE for realising remote procedure calls between different platforms, the use of an SQL-92 compatible relational database, and the design of a new software development tool (called the protocol definition language compiler) for describing the interface of a new application that is to be integrated into a hospital environment.

  14. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  15. A Computer-Aided Abstracting Tool Kit.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1993-01-01

    Reports on the development of a prototype computerized abstractor's assistant called TEXNET, a text network management system. Features of the system discussed include semantic dependency links; displays of text structure; basic text editing; extracting; weighting methods; and listings of frequent words. (Contains 25 references.) (LRW)

  16. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  17. Bat detective-Deep learning tools for bat acoustic signal detection.

    PubMed

    Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E

    2018-03-01

    Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
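
    As a generic illustration of the detection approach (not the published Bat Detective architecture), the sketch below defines a small convolutional network that scores fixed-size spectrogram windows for the presence of a call and slides it over a longer spectrogram. The window size, layer shapes, and threshold are assumptions; a trained model and real spectrograms would be needed to use it.

    ```python
    import torch
    import torch.nn as nn

    class CallDetector(nn.Module):
        """Tiny CNN scoring 64x64 spectrogram windows for call presence.
        A generic stand-in, not the published Bat Detective architecture.
        """
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 1),  # logit: call vs. background
            )

        def forward(self, x):        # x: (batch, 1, 64, 64) windows
            return self.head(self.features(x))

    def detect(model, spec, hop=16, threshold=0.5):
        """Sliding-window detection; spec: (64 freq bins, T frames) tensor.
        Returns the start frame of each window scored above threshold.
        """
        with torch.no_grad():
            windows = spec.unfold(1, 64, hop).permute(1, 0, 2)  # (n, 64, 64)
            scores = torch.sigmoid(model(windows.unsqueeze(1))).squeeze(1)
        return (scores > threshold).nonzero().squeeze(1) * hop
    ```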

  18. Bat detective—Deep learning tools for bat acoustic signal detection

    PubMed Central

    Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.

    2018-01-01

    Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076

  19. Reflective Journaling for Critical Thinking Development in Advanced Practice Registered Nurse Students.

    PubMed

    Raterink, Ginger

    2016-02-01

    Critical thinking, clinical decision making, and critical reflection have been identified as skills required of nurses in every clinical situation. The Educating Nurses: A Call for Radical Transformation report suggested that critical reflection is a key to improving the educational process. Reflective journaling is a tool that helps develop such skills. This article presents the tool of reflective journaling and the use of this process by educators working with students. It describes the use of reflective journaling in graduate nursing education, as well as a scoring process to evaluate the reflection and provide feedback. Students and faculty found the journaling helpful for reflecting on clinical situations with a focus on critical thinking skill development. The rubric scoring tool provided faculty with a method for feedback. Reflective journaling is a tool that faculty and students can use to develop critical thinking skills for the role of the advanced practice RN. A rubric scoring system offers a consistent format for feedback. Copyright 2016, SLACK Incorporated.

  20. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL also offers two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than those of alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
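
    To make the variogram idea concrete: a directional variogram for parameter i can be estimated as gamma_i(h) = 0.5 E[(y(x + h e_i) - y(x))^2], and IVARS integrates gamma_i over perturbation scales up to H. Below is a minimal numerical sketch on a toy model, assuming uniform [0, 1] parameters; it illustrates the metric, not the VARS-TOOL implementation.

        import numpy as np

        def toy_model(x):
            # Toy response: the first parameter matters far more than the second.
            return np.sin(2 * np.pi * x[:, 0]) + 0.3 * x[:, 1] ** 2

        def ivars(model, dim, H=0.3, n_h=10, n_base=2000, seed=0):
            rng = np.random.default_rng(seed)
            hs = np.linspace(H / n_h, H, n_h)
            metrics = np.zeros(dim)
            for i in range(dim):
                gammas = []
                for h in hs:
                    x = rng.uniform(0, 1 - h, size=(n_base, dim))  # keep x+h in [0,1]
                    x2 = x.copy()
                    x2[:, i] += h                      # step of size h along axis i
                    gammas.append(0.5 * np.mean((model(x2) - model(x)) ** 2))
                metrics[i] = np.trapz(gammas, hs)      # integrate variogram up to H
            return metrics

        print(ivars(toy_model, dim=2))  # the first (sine) parameter dominates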

  1. The automatic back-check mechanism of mask tooling database and automatic transmission of mask tooling data

    NASA Astrophysics Data System (ADS)

    Xu, Zhe; Peng, M. G.; Tu, Lin Hsin; Lee, Cedric; Lin, J. K.; Jan, Jian Feng; Yin, Alb; Wang, Pei

    2006-10-01

    Nowadays, most foundries pay increasing attention to reducing CD width. Although lithography technologies have advanced drastically, mask data accuracy is an even bigger challenge than before. Moreover, mask (reticle) prices have also risen sharply, so data accuracy requires special treatment. We have developed a system called eFDMS to guarantee mask data accuracy. eFDMS performs the automatic back-check of the mask tooling database and the transmission of mask tooling data. We integrated our eFDMS system with the standard mask tooling system K2 so that the processes upstream and downstream of the mask tooling main body K2 run smoothly and predictably. Competition in the IC marketplace is gradually shifting from high-tech processes to lower prices, so controlling product cost plays an increasingly significant role for foundries. Before this competition intensifies further, the cost task should be prepared ahead of time.

  2. Designing Specification Languages for Process Control Systems: Lessons Learned and Steps to the Future

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon

    1999-01-01

    Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation by FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of the RSML project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.

  3. The Creation of a CPU Timer for High Fidelity Programs

    NASA Technical Reports Server (NTRS)

    Dick, Aidan A.

    2011-01-01

    Using C and C++ programming languages, a tool was developed that measures the efficiency of a program by recording the amount of CPU time that various functions consume. By inserting the tool between lines of code in the program, one can receive a detailed report of the absolute and relative time consumption associated with each section. After adapting the generic tool for a high-fidelity launch vehicle simulation program called MAVERIC, the components of a frequently used function called "derivatives ( )" were measured. Out of the 34 sub-functions in "derivatives ( )", it was found that the top 8 sub-functions made up 83.1% of the total time spent. In order to decrease the overall run time of MAVERIC, a change was implemented in the sub-function "Event_Controller ( )". Reformatting "Event_Controller ( )" led to a 36.9% decrease in the total CPU time spent by that sub-function, and a 3.2% decrease in the total CPU time spent by the overarching function "derivatives ( )".
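
    The same section-timing idea can be sketched in a few lines of Python; the section names below are hypothetical, and time.process_time() plays the role of the CPU clock used in the C/C++ tool.

        import time
        from collections import defaultdict
        from contextlib import contextmanager

        _totals = defaultdict(float)

        @contextmanager
        def cpu_section(name):
            """Accumulate CPU time spent between entry and exit under `name`."""
            start = time.process_time()
            try:
                yield
            finally:
                _totals[name] += time.process_time() - start

        def report():
            # Absolute seconds plus share of the total, largest first.
            grand = sum(_totals.values()) or 1.0
            for name, secs in sorted(_totals.items(), key=lambda kv: -kv[1]):
                print(f"{name:20s} {secs:9.4f} s  {100 * secs / grand:5.1f} %")

        # Usage: wrap the sections of interest, then print the report.
        with cpu_section("derivatives"):
            sum(i * i for i in range(10**6))
        with cpu_section("event_controller"):
            sorted(range(10**5), key=lambda i: -i)
        report()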

  4. MutAid: Sanger and NGS Based Integrated Pipeline for Mutation Identification, Validation and Annotation in Human Molecular Genetics.

    PubMed

    Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Traditional Sanger sequencing as well as Next-Generation Sequencing have been used for the identification of disease causing mutations in human molecular research. The majority of currently available tools are developed for research and explorative purposes and often do not provide a complete, efficient, one-stop solution. As the focus of currently developed tools is mainly on NGS data analysis, no integrative solution for the analysis of Sanger data is provided and consequently a one-stop solution to analyze reads from both sequencing platforms is not available. We have therefore developed a new pipeline called MutAid to analyze and interpret raw sequencing data produced by Sanger or several NGS sequencing platforms. It performs format conversion, base calling, quality trimming, filtering, read mapping, variant calling, variant annotation and analysis of Sanger and NGS data under a single platform. It is capable of analyzing reads from multiple patients in a single run to create a list of potential disease causing base substitutions as well as insertions and deletions. MutAid has been developed for expert and non-expert users and supports four sequencing platforms including Sanger, Illumina, 454 and Ion Torrent. Furthermore, for NGS data analysis, five read mappers including BWA, TMAP, Bowtie, Bowtie2 and GSNAP and four variant callers including GATK-HaplotypeCaller, SAMTOOLS, Freebayes and VarScan2 pipelines are supported. MutAid is freely available at https://sourceforge.net/projects/mutaid.

  5. MutAid: Sanger and NGS Based Integrated Pipeline for Mutation Identification, Validation and Annotation in Human Molecular Genetics

    PubMed Central

    Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Traditional Sanger sequencing as well as Next-Generation Sequencing have been used for the identification of disease causing mutations in human molecular research. The majority of currently available tools are developed for research and explorative purposes and often do not provide a complete, efficient, one-stop solution. As the focus of currently developed tools is mainly on NGS data analysis, no integrative solution for the analysis of Sanger data is provided and consequently a one-stop solution to analyze reads from both sequencing platforms is not available. We have therefore developed a new pipeline called MutAid to analyze and interpret raw sequencing data produced by Sanger or several NGS sequencing platforms. It performs format conversion, base calling, quality trimming, filtering, read mapping, variant calling, variant annotation and analysis of Sanger and NGS data under a single platform. It is capable of analyzing reads from multiple patients in a single run to create a list of potential disease causing base substitutions as well as insertions and deletions. MutAid has been developed for expert and non-expert users and supports four sequencing platforms including Sanger, Illumina, 454 and Ion Torrent. Furthermore, for NGS data analysis, five read mappers including BWA, TMAP, Bowtie, Bowtie2 and GSNAP and four variant callers including GATK-HaplotypeCaller, SAMTOOLS, Freebayes and VarScan2 pipelines are supported. MutAid is freely available at https://sourceforge.net/projects/mutaid. PMID:26840129
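
    For a sense of what one NGS branch of such a pipeline executes, the sketch below wires up read mapping and variant calling with bwa, samtools and bcftools (bcftools standing in here for one of the supported callers). File names are hypothetical, and MutAid itself adds base calling, trimming, annotation and Sanger support around these steps.

        import subprocess

        def run(cmd):
            print("+", cmd)
            subprocess.run(cmd, shell=True, check=True)

        ref, reads, sample = "ref.fa", "patient1.fastq", "patient1"
        run(f"bwa index {ref}")                        # one-time reference index
        run(f"bwa mem {ref} {reads} | samtools sort -o {sample}.bam -")
        run(f"samtools index {sample}.bam")
        run(f"bcftools mpileup -f {ref} {sample}.bam"
            f" | bcftools call -mv -Ov -o {sample}.vcf")   # call SNVs and indels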

  6. Development of a selection tool for use in the identification, recruitment & retention of safe intermodal transportation workers.

    DOT National Transportation Integrated Search

    2012-05-01

    A total of 486 transportation employees employed by a major railroad completed a series of tests constructed for this project. These tests or instruments included the Denver Lifestyle Questionnaire, a performance rating scale called the Employees...

  7. Paradigms, Citations, and Maps of Science: A Personal History.

    ERIC Educational Resources Information Center

    Small, Henry

    2003-01-01

    Discusses mapping science and Kuhn's theories of paradigms and scientific development. Highlights include cocitation clustering; bibliometric definition of a paradigm; specialty dynamics; pathways through science; a new Web tool called Essential Science Indicators (ESI) for studying the structure of science; and microrevolutions. (Author/LRW)

  8. Clinical Assessment in Mathematics: Learning the Craft.

    ERIC Educational Resources Information Center

    Hunting, Robert P.; Doig, Brian A.

    1997-01-01

    Discusses a professional development program called Clinical Approaches to Mathematics Assessment. Argues for the advanced training of mathematics teachers who understand knowledge construction processes of students; can use clinical tools for evaluating a student's unique mathematical "fingerprint"; and can create or adapt problems, tasks, or…

  9. RATT: Rapid Annotation Transfer Tool

    PubMed Central

    Otto, Thomas D.; Dillon, Gary P.; Degrave, Wim S.; Berriman, Matthew

    2011-01-01

    Second-generation sequencing technologies have made large-scale sequencing projects commonplace. However, making use of these datasets often requires gene function to be ascribed genome wide. Although tool development has kept pace with the changes in sequence production, for tasks such as mapping, de novo assembly or visualization, genome annotation remains a challenge. We have developed a method to rapidly provide accurate annotation for new genomes using previously annotated genomes as a reference. The method, implemented in a tool called RATT (Rapid Annotation Transfer Tool), transfers annotations from a high-quality reference to a new genome on the basis of conserved synteny. We demonstrate that a Mycobacterium tuberculosis genome or a single 2.5 Mb chromosome from a malaria parasite can be annotated in less than five minutes with only modest computational resources. RATT is available at http://ratt.sourceforge.net. PMID:21306991

  10. Lakeside: Merging Urban Design with Scientific Analysis

    ScienceCinema

    Guzowski, Leah; Catlett, Charlie; Woodbury, Ed

    2018-01-16

    Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding about the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.

  11. SNV-PPILP: refined SNV calling for tumor data using perfect phylogenies and ILP.

    PubMed

    van Rens, Karen E; Mäkinen, Veli; Tomescu, Alexandru I

    2015-04-01

    Recent studies sequenced tumor samples from the same progenitor at different development stages and showed that by taking into account the phylogeny of this development, single-nucleotide variant (SNV) calling can be improved. Accurate SNV calls can better reveal early-stage tumors, identify mechanisms of cancer progression or help in drug targeting. We present SNV-PPILP, a fast and easy to use tool for refining GATK's Unified Genotyper SNV calls, for multiple samples assumed to form a phylogeny. We tested SNV-PPILP on simulated data, with a varying number of samples, SNVs, read coverage and violations of the perfect phylogeny assumption. We always match or improve the accuracy of GATK, with a significant improvement on low read coverage. SNV-PPILP, available at cs.helsinki.fi/gsa/snv-ppilp/, is written in Python and requires the free ILP solver lp_solve. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Comparative Investigation on Tool Wear during End Milling of AISI H13 Steel with Different Tool Path Strategies

    NASA Astrophysics Data System (ADS)

    Adesta, Erry Yulian T.; Riza, Muhammad; Avicena

    2018-03-01

    Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper aims to investigate the effect of two tool path strategies, contour-in and zigzag, applied to tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were set to vary. For an experiment with three factors at three levels, a Response Surface Method (RSM) design of experiment with a standard called Central Composite Design (CCD) was employed. The results obtained indicate that tool wear increases significantly at the higher range of feed per tooth, compared to cutting speed and depth of cut. This experimental result was then confirmed statistically by developing an empirical model. The prediction model for the response variable of tool wear for the contour-in strategy developed in this research shows good agreement with the experimental work.

  13. DaRT: A CALL System to Help Students Practice and Develop Reasoning Skills in Choosing English Articles.

    ERIC Educational Resources Information Center

    Yoshii, Rika; Milne, Alastair

    1998-01-01

    Describes DaRT, a computer assisted language-learning system for helping English-as-a-Second-Language students master English articles. DaRT uses a diagrammatic reasoning tool to present communicative contexts for exercises in choosing appropriate articles. This paper describes the development of DaRT and DaRT's system components and concludes…

  14. Screening for Autism in Iranian Preschoolers: Contrasting M-CHAT and a Scale Developed in Iran

    ERIC Educational Resources Information Center

    Samadi, Sayyed Ali; McConkey, Roy

    2015-01-01

    Suitable screening instruments for the early diagnosis of autism are not readily available for use with preschoolers in non-Western countries. This study evaluated two tools: M-CHAT which is widely used internationally and one developed in Iran called Hiva. A population sample was recruited of nearly 3000 preschoolers in one Iranian city. Parents…

  15. Developing a Technological Pedagogical Content Knowledge (TPACK) Assessment for Preservice Teachers Learning to Teach English as a Foreign Language

    ERIC Educational Resources Information Center

    Baser, Derya; Kopcha, Theodore J.; Ozden, M. Yasar

    2016-01-01

    This paper reports the development and validation process of a self-assessment survey that examines technological pedagogical content knowledge (TPACK) among preservice teachers learning to teach English as a foreign language (EFL). The survey, called TPACK-EFL, aims to provide an assessment tool for preservice foreign language teachers that…

  16. Tool for Merging Proposals Into DSN Schedules

    NASA Technical Reports Server (NTRS)

    Khanampornpan, Teerapat; Kwok, John; Call, Jared

    2008-01-01

    A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
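
    The conflict check at the core of such a merge reduces to interval overlap on a shared resource. A minimal sketch, assuming schedule entries can be reduced to (antenna, start, end) tuples and leaving the 7da/C1-C2 parsing aside; all data below are illustrative.

        def conflicts(schedule, proposal):
            """Return proposal entries that overlap a booked slot on the same antenna."""
            clashes = []
            for ant_p, s_p, e_p in proposal:
                for ant_s, s_s, e_s in schedule:
                    if ant_p == ant_s and s_p < e_s and s_s < e_p:  # interval overlap
                        clashes.append(((ant_p, s_p, e_p), (ant_s, s_s, e_s)))
            return clashes

        schedule = [("DSS-14", 100, 200), ("DSS-43", 150, 260)]
        proposal = [("DSS-14", 180, 240), ("DSS-43", 260, 300)]
        print(conflicts(schedule, proposal))   # only the DSS-14 request clashes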

  17. Meta-Assessment in a Project-Based Systems Engineering Course

    ERIC Educational Resources Information Center

    Wengrowicz, Niva; Dori, Yehudit Judy; Dori, Dov

    2017-01-01

    Project-based learning (PBL) facilitates significant learning, but it poses a major assessment challenge for assessing individual content knowledge. We developed and implemented an assessment approach and tool for a mandatory undergraduate systems engineering PBL-based course. We call this type of assessment "student-oriented"…

  18. Validating Automated Measures of Text Complexity

    ERIC Educational Resources Information Center

    Sheehan, Kathleen M.

    2017-01-01

    Automated text complexity measurement tools (also called readability metrics) have been proposed as a way to help teachers, textbook publishers, and assessment developers select texts that are closely aligned with the new, more demanding text complexity expectations specified in the Common Core State Standards. This article examines a critical…

  19. Learning Progressions as Tools for Assessment and Learning

    ERIC Educational Resources Information Center

    Shepard, Lorrie A.

    2018-01-01

    This article addresses the teaching and learning side of the learning progressions literature, calling out for measurement specialists the knowledge most needed when collaborating with subject-matter experts in the development of learning progressions. Learning progressions are one of the strongest instantiations of principles from "Knowing…

  20. Early postnatal weight gain as a predictor for the development of retinopathy of prematurity.

    PubMed

    Biniwale, Manoj; Weiner, Angela; Sardesai, Smeeta; Cayabyab, Rowena; Barton, Lorayne; Ramanathan, Rangasamy

    2017-10-01

    The objective of this study is to validate the reliability of early postnatal weight gain as an accurate predictor of type 1 retinopathy of prematurity (ROP) requiring treatment in a large, predominantly Hispanic US cohort, with the use of an online tool called WINROP (weight, insulin-like growth factor 1 (IGF-1), neonatal retinopathy of prematurity). This retrospective cohort study consisted of preterm infants <32 weeks gestation with birth weight <1500 g. Weekly weights up to 36 weeks post-menstrual age, or discharge if earlier, were entered into the WINROP tool. The tool generated an alarm and a risk indicator for developing ROP. Infants with type 1 ROP requiring treatment, as well as all stages of ROP, were compared with the alarms and risks generated by the WINROP tool. A total of 492 infants were entered into the WINROP tool. Of the infants who developed type 1 ROP requiring treatment, the WINROP tool detected 80/89 (90%) at less than 32 weeks gestation. Nine infants who developed type 1 ROP were classified as low risk and did not trigger an alarm. Postnatal weight gain alone, in a predominantly Hispanic US population, predicted type 1 ROP requiring treatment before 32 weeks of gestation with a sensitivity of 90%. The tool appeared to identify the majority of affected infants much earlier than the scheduled screening.
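
    The underlying screening idea can be illustrated with a deliberately simplified sketch: raise an alarm when cumulative weight gain falls too far below an expected reference. The reference gain and deficit limit below are hypothetical placeholders; the actual WINROP algorithm models longitudinal weight and IGF-1 far more carefully.

        def winrop_like_alarm(weekly_weights_g, expected_gain_g=180, deficit_limit_g=250):
            """Return the week at which the cumulative shortfall of weight gain
            versus the expected reference exceeds the limit, else None."""
            deficit = 0.0
            for week in range(1, len(weekly_weights_g)):
                gain = weekly_weights_g[week] - weekly_weights_g[week - 1]
                deficit += max(0.0, expected_gain_g - gain)   # only shortfalls count
                if deficit > deficit_limit_g:
                    return week
            return None

        print(winrop_like_alarm([900, 980, 1040, 1090, 1160]))  # slow gain: alarms at week 3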

  1. Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort

    PubMed Central

    Gambin, Tomasz; Akdemir, Zeynep C.; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M.B.; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M.; Eldomery, Mohammad K.; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W.; Boerwinkle, Eric; Beaudet, Arthur L.; Gibbs, Richard A.

    2017-01-01

    We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor–Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Out of 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR and 64 were confirmed. These include 18 single-exon deletions, of which 8 were exclusively detected by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17–50% of pathogenic CNVs in different disease cohorts where 7.1–11% of the molecular diagnostic solve rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. PMID:27980096
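
    The core signal such an algorithm exploits can be sketched simply: after normalizing read depth for library size, an exon with near-zero depth in one sample but normal depth across the cohort is a homozygous-deletion candidate. The thresholds below are illustrative, not HMZDelFinder's.

        import numpy as np

        def flag_hmz_deletions(depth, zero_frac=0.1, cohort_norm=0.5):
            """depth: (samples, exons) raw read-depth matrix.
            Returns (sample, exon) index pairs that look homozygously deleted."""
            # Normalize out per-sample library size.
            norm = depth / (depth.mean(axis=1, keepdims=True) + 1e-9)
            exon_median = np.median(norm, axis=0)   # cohort-typical capture level
            calls = []
            for s in range(norm.shape[0]):
                for e in range(norm.shape[1]):
                    if (exon_median[e] > cohort_norm
                            and norm[s, e] < zero_frac * exon_median[e]):
                        calls.append((s, e))
            return calls

        depth = np.array([[100, 90, 110], [95, 2, 105], [105, 88, 98]])
        print(flag_hmz_deletions(depth))   # flags sample 1, exon 1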

  2. Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data

    NASA Astrophysics Data System (ADS)

    Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin

    2017-02-01

    Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform and by expert-based review. In addition we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve the reproducibility of results in the context of multithreading.
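
    The benchmark arithmetic behind such comparisons is straightforward once calls and validated mutations are keyed identically; a minimal sketch with illustrative variants keyed by (chrom, pos, ref, alt):

        truth = {("chr1", 101, "A", "T"), ("chr1", 250, "G", "C"), ("chr2", 77, "C", "A")}
        called = {("chr1", 101, "A", "T"), ("chr2", 77, "C", "A"), ("chr2", 90, "T", "G")}

        tp = len(called & truth)       # validated variants the caller found
        fp = len(called - truth)       # calls with no validated counterpart
        fn = len(truth - called)       # validated variants the caller missed
        sensitivity = tp / (tp + fn)   # recall: fraction of true variants recovered
        precision = tp / (tp + fp)     # fraction of calls that are real
        print(f"sensitivity={sensitivity:.2f} precision={precision:.2f}")  # 0.67 / 0.67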

  3. Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology

    NASA Astrophysics Data System (ADS)

    Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya

    2017-09-01

    Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. Currently, there are many review articles introducing thermal error research on CNC machine tools, but they mainly focus on thermal issues in small and medium-sized CNC machine tools and seldom introduce thermal error monitoring technologies. This paper gives an overview of the research on the thermal error of CNC machine tools and emphasizes the study of thermal error of the heavy-duty CNC machine tool in three areas: the causes of thermal error of heavy-duty CNC machine tools, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology called "fiber Bragg grating (FBG) distributed sensing technology" for heavy-duty CNC machine tools is introduced in detail. This technology forms an intelligent sensing and monitoring system for heavy-duty CNC machine tools. This paper fills a gap in this kind of review article, guiding the development of this industry field and opening up new areas of research on heavy-duty CNC machine tool thermal error.

  4. Epidemiology of transmissible diseases: Array hybridization and next generation sequencing as universal nucleic acid-mediated typing tools.

    PubMed

    Michael Dunne, W; Pouseele, Hannes; Monecke, Stefan; Ehricht, Ralf; van Belkum, Alex

    2017-09-21

    The magnitude of interest in the epidemiology of transmissible human diseases is reflected in the vast number of tools and methods developed recently with the expressed purpose to characterize and track evolutionary changes that occur in agents of these diseases over time. Within the past decade a new suite of such tools has become available with the emergence of the so-called "omics" technologies. Among these, two are exponents of the ongoing genomic revolution. Firstly, high-density nucleic acid probe arrays have been proposed and developed using various chemical and physical approaches. Via hybridization-mediated detection of entire genes or genetic polymorphisms in such genes and intergenic regions these so called "DNA chips" have been successfully applied for distinguishing very closely related microbial species and strains. Second and even more phenomenal, next generation sequencing (NGS) has facilitated the assessment of the complete nucleotide sequence of entire microbial genomes. This technology currently provides the most detailed level of bacterial genotyping and hence allows for the resolution of microbial spread and short-term evolution in minute detail. We will here review the very recent history of these two technologies, sketch their usefulness in the elucidation of the spread and epidemiology of mostly hospital-acquired infections and discuss future developments. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. TRC research products: Components for service robots

    NASA Technical Reports Server (NTRS)

    Lob, W. Stuart

    1994-01-01

    Transitions Research Corporation has developed a variety of technologies to accomplish its central mission: the creation of commercially viable robots for the service industry. Collectively, these technologies comprise the TRC 'robot tool kit.' The company started by developing a robot base that serves as a foundation for mobile robot research and development, both within TRC and at customer sites around the world. A diverse collection of sensing techniques evolved more recently, many of which have been made available to the international mobile robot research community as commercial products. These 'tool-kit' research products are described in this paper. The largest component of TRC's commercial operation is a product called HelpMate for material transport and delivery in health care institutions.

  6. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. And even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows the developers to focus on the novel aspects of their tools, rather than on the basic functions, which can contribute substantially to faster development, and better tools for proteomics.

  7. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straight-forward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools webserver is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. The National energy modeling system

    NASA Astrophysics Data System (ADS)

    The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focused analytical tools to examine energy policy options. For the purposes of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national and regional energy modeling and energy analysis needs.

  9. Newcastle Disease: Progress and gaps in the development of vaccines and diagnostic tools

    USDA-ARS?s Scientific Manuscript database

    Newcastle disease (ND) is a contagious disease of birds that can have severe economic consequences for any poultry producer, including a serious impact on the international trade of poultry and eggs. Newcastle disease virus (NDV) isolates are also called avian paramyxovirus serotype-1 isolates, but ...

  10. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  11. Shaker Oats: Fortifying Musicality

    ERIC Educational Resources Information Center

    Semmes, Laurie R.

    2010-01-01

    In this article, the author describes how an experiment in a class she taught called Minority Musics of North America developed into a surprisingly successful and flexible teaching tool known as "Shaker Oats," created to encourage the concepts of ensemble and community. Most music educators in the United States today are familiar with…

  12. Computer-Assisted Language Learning Authoring Issues

    ERIC Educational Resources Information Center

    Otto, Sue E. K.; Pusack, James P.

    2009-01-01

    Computer-assisted language learning (CALL) authoring refers to a wide variety of creative development activities using software tools that run the gamut from simple templates (easy-to-use predefined forms into which content is typed) to complex authoring environments (flexible but harder-to-use systems, requiring advanced skills and a great deal…

  13. Epistemologies in Practice: Making Scientific Practices Meaningful for Students

    ERIC Educational Resources Information Center

    Berland, Leema K.; Schwarz, Christina V.; Krist, Christina; Kenyon, Lisa; Lo, Abraham S.; Reiser, Brian J.

    2016-01-01

    Recent research and policy documents call for engaging students and teachers in scientific practices such that the goal of science education shifts from students "knowing" scientific and epistemic ideas, to students "developing and using" these understandings as tools to make sense of the world. This perspective pushes students…

  14. Identification of immunoglobulins using Chou's pseudo amino acid composition with feature selection technique.

    PubMed

    Tang, Hua; Chen, Wei; Lin, Hao

    2016-04-01

    Immunoglobulins, also called antibodies, are a group of cell surface proteins which are produced by the immune system in response to the presence of a foreign substance (called antigen). They play key roles in many medical, diagnostic and biotechnological applications. Correct identification of immunoglobulins is crucial to the comprehension of humoral immune function. With the avalanche of protein sequences identified in the postgenomic age, it is highly desirable to develop computational methods to timely identify immunoglobulins. In view of this, we designed a predictor called "IGPred" by formulating protein sequences with the pseudo amino acid composition into which nine physiochemical properties of amino acids were incorporated. Jackknife cross-validated results showed that 96.3% of immunoglobulins and 97.5% of non-immunoglobulins can be correctly predicted, indicating that IGPred holds very high potential to become a useful tool for antibody analysis. For the convenience of most experimental scientists, a web-server for IGPred was established at http://lin.uestc.edu.cn/server/IGPred. We believe that the web-server will become a powerful tool to study immunoglobulins and to guide related experimental validations.
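
    As an illustration of the feature layout, Chou's pseudo amino acid composition augments the 20 amino acid frequencies with lambda sequence-order correlation factors computed from physicochemical scales. The sketch below uses a single scale (Kyte-Doolittle hydropathy) rather than the nine properties combined in IGPred, so it shows only the general construction.

        import numpy as np

        AAS = "ACDEFGHIKLMNPQRSTVWY"
        KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
              "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
              "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
              "Y": -1.3, "V": 4.2}
        vals = np.array([KD[a] for a in AAS])
        kd_norm = {a: (KD[a] - vals.mean()) / vals.std() for a in AAS}  # standardized scale

        def pseaac(seq, lam=3, w=0.05):
            """Return the 20 + lam dimensional pseudo amino acid composition."""
            seq = seq.upper()
            freq = np.array([seq.count(a) for a in AAS], dtype=float) / len(seq)
            # Sequence-order correlation factors over residue gaps 1..lam.
            theta = np.array([np.mean([(kd_norm[seq[i]] - kd_norm[seq[i + j]]) ** 2
                                       for i in range(len(seq) - j)])
                              for j in range(1, lam + 1)])
            denom = 1.0 + w * theta.sum()
            return np.concatenate([freq / denom, w * theta / denom])

        print(pseaac("MKVLAAGICGLLLAGQPS").round(3))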

  15. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  16. Impact and User Satisfaction of a Clinical Information Portal Embedded in an Electronic Health Record

    PubMed Central

    Tannery, Nancy H; Epstein, Barbara A; Wessel, Charles B; Yarger, Frances; LaDue, John; Klem, Mary Lou

    2011-01-01

    In 2008, a clinical information tool was developed and embedded in the electronic health record system of an academic medical center. In 2009, the initial information tool, Clinical-e, was superseded by a portal called Clinical Focus, with a single search box enabling a federated search of selected online information resources. To measure the usefulness and impact of Clinical Focus, a survey was used to gather feedback about users' experience with this clinical resource. The survey determined what type of clinicians were using this tool and assessed user satisfaction and perceived impact on patient care decision making. Initial survey results suggest the majority of respondents found Clinical Focus easy to navigate, the content easy to read, and the retrieved information relevant and complete. The majority would recommend Clinical Focus to their colleagues. Results indicate that this tool is a promising area for future development. PMID:22016670

  17. Design of a self-administered online food frequency questionnaire (FFQ) to assess dietary intake among university population.

    PubMed

    González Carrascosa, R; Bayo Montó, J L; Meneu Barreira, T; García Segovia, P; Martínez-Monzó, J

    2011-01-01

    To introduce and describe a new tool called UPV-FFQ to evaluate dietary intake in the university population. The new tool consists principally of a self-administered online food frequency questionnaire (FFQ). The UPV-FFQ was developed as a set of web pages using ASP.NET 2.0 technology with an SQL Server 2005 database as support. The paper-and-pencil FFQ called "Dieta, salud y antropometría en la población universitaria" was used as a model for developing the FFQ. The tool has three parts: (1) a homepage, (2) a general questionnaire and (3) an FFQ. The FFQ has a closed list of 84 food items commonly consumed in the Valencia region. Respondents have to indicate the food items they consume (2 possible options), the frequency of consumption (9 response options) and the quantity consumed (7 response options). The UPV-FFQ includes approximately 250 color photographs representing three portion sizes. The photographs help respondents choose the portion size that best matches their habitual portions. The new tool provides quantitative information on the habitual intake of 31 nutritional parameters and qualitative information from the general questionnaire. A pilot study was conducted with a total of 57 respondents. The mean time to complete the questionnaire was 15 minutes. The pilot study concluded that the questionnaire was easy to use, low cost and time-effective.
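
    The intake computation behind such an FFQ is simple arithmetic: daily intake of a nutrient is the sum over foods of eating frequency times portion size times nutrient density. A minimal sketch with illustrative food values (not the UPV-FFQ database):

        foods = {
            #          freq/day  portion_g  kcal/100g  protein_g/100g
            "bread":     (2.0,      40,       265,        9.0),
            "milk":      (1.0,     200,        64,        3.3),
            "chicken":   (0.5,     120,       165,       31.0),
        }

        kcal = sum(f * g * k / 100 for f, g, k, _ in foods.values())
        protein = sum(f * g * p / 100 for f, g, _, p in foods.values())
        print(f"energy ~ {kcal:.0f} kcal/day, protein ~ {protein:.1f} g/day")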

  18. CoVaCS: a consensus variant calling system.

    PubMed

    Chiara, Matteo; Gioiosa, Silvia; Chillemi, Giovanni; D'Antonio, Mattia; Flati, Tiziano; Picardi, Ernesto; Zambelli, Federico; Horner, David Stephen; Pesole, Graziano; Castrignanò, Tiziana

    2018-02-05

    The advent and ongoing development of next generation sequencing technologies (NGS) has led to a rapid increase in the rate of human genome re-sequencing data, paving the way for personalized genomics and precision medicine. The body of genome resequencing data is progressively increasing, underlining the need for accurate and time-effective bioinformatics systems for genotyping, a crucial prerequisite for identification of candidate causal mutations in diagnostic screens. Here we present CoVaCS, a fully automated, highly accurate system with a web-based graphical interface for genotyping and variant annotation. Extensive tests on a gold standard benchmark dataset, the NA12878 Illumina platinum genome, confirm that call-sets based on our consensus strategy are completely in line with those attained by similar command line based approaches, and far more accurate than call-sets from any individual tool. Importantly, our system exhibits better sensitivity and higher specificity than equivalent commercial software. CoVaCS offers optimized pipelines integrating state-of-the-art tools for variant calling and annotation for whole genome sequencing (WGS), whole-exome sequencing (WES) and target-gene sequencing (TGS) data. The system is currently hosted at Cineca, and offers the speed of an HPC computing facility, a crucial consideration when large numbers of samples must be analysed. Importantly, all the analyses are performed automatically, allowing high reproducibility of the results. As such, we believe that CoVaCS can be a valuable tool for the analysis of human genome resequencing studies. CoVaCS is available at: https://bioinformatics.cineca.it/covacs .
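
    The consensus rule at the heart of such a system can be sketched as simple voting over the call sets of the individual tools; the variant keys, caller names and support threshold below are illustrative.

        from collections import Counter

        def consensus(callsets, min_support=2):
            """Keep variants reported by at least `min_support` callers."""
            votes = Counter(v for calls in callsets for v in set(calls))
            return {v for v, n in votes.items() if n >= min_support}

        gatk = {("chr1", 101, "A", "T"), ("chr1", 300, "C", "G")}
        varscan = {("chr1", 101, "A", "T"), ("chr2", 55, "G", "A")}
        freebayes = {("chr1", 101, "A", "T"), ("chr2", 55, "G", "A"), ("chr3", 7, "T", "C")}
        print(consensus([gatk, varscan, freebayes]))  # the two variants with >= 2 votes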

  19. Development of Medical Technology for Contingency Response to Marrow Toxic Agents

    DTIC Science & Technology

    2018-06-06

    Health Physics e. Emergency Medicine f. Burn Care g. State Public Health h. Federal Public Health i. Emergency Management. 2. The group has...Preparedness 4 Project: Local Public Health Radiological Preparedness Gap Review and Tool Development Identification 1. The National Association...of County and City Health Officials (NACCHO) has held multiple conference calls with leaders within their organization to identify the areas of

  20. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  1. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neuroHIV (human immunodeficiency virus), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance to eradicate and monitor neuroAIDS (neuro-acquired immunodeficiency syndrome). Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.

  2. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.

  3. Multimodal visualization interface for data management, self-learning and data presentation.

    PubMed

    Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M

    2006-10-01

    A multimodal visualization software package, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in Radiology are integrated into the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture, and therefore updates of the system for custom applications are possible.

  4. Problem solving with genetic algorithms and Splicer

    NASA Technical Reports Server (NTRS)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
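
    The generate-evaluate-select-recombine loop that such a tool wraps can be sketched in a few lines; the bit-counting fitness and all parameters below are toy illustrations, not Splicer's interface.

        import random

        def fitness(bits):                 # toy objective: count of 1-bits
            return sum(bits)

        def evolve(pop_size=30, n_bits=16, gens=40, p_mut=0.02):
            pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
            for _ in range(gens):
                # Tournament selection, one-point crossover, bit-flip mutation.
                def pick():
                    a, b = random.sample(pop, 2)
                    return a if fitness(a) >= fitness(b) else b
                nxt = []
                while len(nxt) < pop_size:
                    p1, p2 = pick(), pick()
                    cut = random.randrange(1, n_bits)
                    child = p1[:cut] + p2[cut:]
                    child = [b ^ (random.random() < p_mut) for b in child]
                    nxt.append(child)
                pop = nxt
            return max(pop, key=fitness)

        print(evolve())  # converges toward the all-ones string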

  5. Biotool2Web: creating simple Web interfaces for bioinformatics applications.

    PubMed

    Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg

    2006-01-01

    Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at http://www.uni-muenster.de/Bioinformatics/services/biotool2web/. Contact: Georg Fuellen (fuellen@alum.mit.edu).
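
    The XML-to-form step can be illustrated with a small sketch; the parameter schema below is invented for illustration and is not the actual Biotool2Web format.

        import xml.etree.ElementTree as ET

        SPEC = """<tool name="revcomp" script="revcomp.py">
          <param name="sequence" label="DNA sequence" type="text"/>
          <param name="reverse" label="Reverse output" type="checkbox"/>
        </tool>"""

        def xml_to_form(spec):
            """Emit an HTML form whose action points at a CGI wrapper."""
            tool = ET.fromstring(spec)
            rows = []
            for p in tool.findall("param"):
                kind = "checkbox" if p.get("type") == "checkbox" else "text"
                rows.append(f'  {p.get("label")}: '
                            f'<input type="{kind}" name="{p.get("name")}"/><br/>')
            body = "\n".join(rows)
            return (f'<form method="post" action="/cgi-bin/{tool.get("script")}">\n'
                    f"{body}\n"
                    f'  <input type="submit" value="Run {tool.get("name")}"/>\n</form>')

        print(xml_to_form(SPEC))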

  6. An overview of new video coding tools under consideration for VP10: the successor to VP9

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

    Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM project has already embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in early stages, a set of new experimental coding tools have already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  7. Call for Standardized Definitions of Osteoarthritis and Risk Stratification for Clinical Trials and Clinical Use

    PubMed Central

    Kraus, Virginia Byers; Blanco, Francisco J.; Englund, Martin; Karsdal, Morten A.; Lohmander, L. Stefan

    2015-01-01

    Osteoarthritis is a heterogeneous disorder. The goals of this review are (1) To stimulate use of standardized nomenclature for osteoarthritis (OA) that could serve as building blocks for describing OA and defining OA phenotypes, in short to provide unifying disease concepts for a heterogeneous disorder; and (2) To stimulate establishment of ROAD (Risk of Osteoarthritis Development) and ROAP (Risk of Osteoarthritis Progression) tools analogous to the FRAX™ instrument for predicting risk of fracture in osteoporosis; and (3) To stimulate formulation of tools for identifying disease in its early preradiographic and/or molecular stages -- REDI (Reliable Early Disease Identification). Consensus around more sensitive and specific diagnostic criteria for OA could spur development of disease modifying therapies for this entity that has proved so recalcitrant to date. We fully acknowledge that as we move forward, we expect to develop more sophisticated definitions, terminology and tools. PMID:25865392

  8. Global Impact Estimation of ISO 50001 Energy Management System for Industrial and Service Sectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghajanzadeh, Arian; Therkelsen, Peter L.; Rao, Prakash

    A methodology has been developed to determine the impacts of the ISO 50001 Energy Management System (EnMS) at a region or country level. The impacts of ISO 50001 EnMS include energy, CO2 emission, and cost savings. This internationally recognized and transparent methodology has been embodied in a user-friendly Microsoft Excel®-based tool called the ISO 50001 Impact Estimator Tool (IET 50001). However, the tool inputs are critical in order to get accurate and defensible results. This report is intended to document the data sources used and assumptions made to calculate the global impact of ISO 50001 EnMS.
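
    The impact arithmetic such an estimator performs can be sketched as follows; every figure below is a placeholder for illustration, not a value from IET 50001.

        sector_energy_tj = 1_200_000      # annual industrial energy use of a region, TJ
        adoption_share = 0.10             # share of that energy covered by ISO 50001 EnMS
        annual_savings_rate = 0.03        # assumed EnMS energy-performance improvement
        emission_factor = 56.1            # t CO2 per TJ (natural-gas-like factor)
        price_per_gj = 8.0                # assumed energy price, USD per GJ

        saved_tj = sector_energy_tj * adoption_share * annual_savings_rate
        print(f"energy saved : {saved_tj:,.0f} TJ/yr")
        print(f"CO2 avoided  : {saved_tj * emission_factor:,.0f} t/yr")
        print(f"cost savings : {saved_tj * 1000 * price_per_gj / 1e6:,.1f} M USD/yr")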

  9. Getting the Best out of Excel

    ERIC Educational Resources Information Center

    Heys, Chris

    2008-01-01

    Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek--this after deriving an equation, called the "cash accumulation equation", to be used in conjunction with them.
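
    Assuming the cash accumulation equation takes the usual future-value form FV = P(1+i)^n + c[((1+i)^n - 1)/i] for principal P, periodic deposit c, per-period rate i and n periods, Goal Seek's root finding can be imitated by bisection; the dollar figures below are illustrative.

        def future_value(P, c, i, n):
            growth = (1 + i) ** n
            return P * growth + c * (growth - 1) / i

        def goal_seek_rate(target, P, c, n, lo=1e-6, hi=1.0):
            """Find the per-period rate i at which future_value reaches target.
            FV grows monotonically with i, so bisection converges."""
            for _ in range(100):
                mid = (lo + hi) / 2
                if future_value(P, c, mid, n) < target:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2

        # What monthly rate turns $5,000 plus $200/month into $50,000 in 10 years?
        rate = goal_seek_rate(50_000, 5_000, 200, 120)
        print(f"required monthly rate ~ {rate:.4%} ({12 * rate:.2%} nominal annual)")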

  10. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance are the efforts required for both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European Project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge level counterparts are maintained. The KA tools we developed are illustrated taking examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  11. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
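
    As a rough illustration of the plots such a tool automates, the sketch below draws a time-series comparison and a measured-versus-simulated scatter plot; the synthetic data and labels are assumptions for illustration, not output of SEE IT itself.

    ```python
    # Minimal sketch of measured-vs-simulated comparison plots;
    # the synthetic week of hourly data is invented for illustration.
    import numpy as np
    import matplotlib.pyplot as plt

    hours = np.arange(24 * 7)
    measured = 50 + 20 * np.sin(2 * np.pi * hours / 24) + np.random.normal(0, 3, hours.size)
    simulated = 48 + 22 * np.sin(2 * np.pi * hours / 24)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.plot(hours, measured, label="measured")      # time-series comparison
    ax1.plot(hours, simulated, label="simulated")
    ax1.set(xlabel="hour", ylabel="power [kW]")
    ax1.legend()
    ax2.scatter(simulated, measured, s=8)            # perfect agreement lies on y = x
    lims = [measured.min(), measured.max()]
    ax2.plot(lims, lims, "k--")
    ax2.set(xlabel="simulated [kW]", ylabel="measured [kW]")
    plt.tight_layout()
    plt.show()
    ```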

  12. Standardized data sharing in a paediatric oncology research network--a proof-of-concept study.

    PubMed

    Hochedlinger, Nina; Nitzlnader, Michael; Falgenhauer, Markus; Welte, Stefan; Hayn, Dieter; Koumakis, Lefteris; Potamias, George; Tsiknakis, Manolis; Saraceno, Davide; Rinaldi, Eugenia; Ladenstein, Ruth; Schreier, Günter

    2015-01-01

    Data collected in the course of clinical trials are potentially valuable for additional scientific research questions in so-called secondary use scenarios. This is of particular importance in rare disease areas like paediatric oncology. If data from several research projects need to be connected, so-called Core Datasets can be used to define which information needs to be extracted from every involved source system. In this work, the utility of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) as a format for Core Datasets was evaluated, and a web tool was developed which receives source ODM XML files and, via Extensible Stylesheet Language Transformation (XSLT), generates standardized Core Dataset ODM XML files. Using this tool, data from different source systems were extracted and pooled for joint analysis in a proof-of-concept study, facilitating both basic syntactic and semantic interoperability.
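
    The core transformation step described here, applying an XSLT stylesheet to a source ODM XML file, might look like the following sketch using the lxml library; the file names and the stylesheet itself are placeholders, not artifacts of the project.

    ```python
    # Sketch of an ODM-to-Core-Dataset transformation via XSLT, assuming lxml;
    # "source_odm.xml" and "core_dataset.xslt" are placeholder file names.
    from lxml import etree

    source_odm = etree.parse("source_odm.xml")       # ODM export from a source system
    stylesheet = etree.parse("core_dataset.xslt")    # mapping to the agreed Core Dataset
    transform = etree.XSLT(stylesheet)
    core_odm = transform(source_odm)                 # standardized Core Dataset ODM XML

    with open("core_dataset_odm.xml", "wb") as f:
        f.write(etree.tostring(core_odm, pretty_print=True,
                               xml_declaration=True, encoding="UTF-8"))
    ```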

  13. Tracking fin whales in the northeast Pacific Ocean with a seafloor seismic network.

    PubMed

    Wilcock, William S D

    2012-10-01

    Ocean bottom seismometer (OBS) networks represent a tool of opportunity to study fin and blue whales. A small OBS network on the Juan de Fuca Ridge in the northeast Pacific Ocean in ~2.3 km of water recorded an extensive data set of 20-Hz fin whale calls. An automated method has been developed to identify arrival times based on instantaneous frequency and amplitude and to locate calls using a grid search even in the presence of a few bad arrival times. When only one whale is calling near the network, tracks can generally be obtained up to distances of ~15 km from the network. When the calls from multiple whales overlap, user supervision is required to identify tracks. The absolute and relative amplitudes of arrivals and their three-component particle motions provide additional constraints on call location but are not useful for extending the distance to which calls can be located. The double-difference method inverts for changes in relative call locations using differences in residuals for pairs of nearby calls recorded on a common station. The method significantly reduces the unsystematic component of the location error, especially when inconsistencies in arrival time observations are minimized by cross-correlation.
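
    A toy version of the grid-search step illustrates the idea: demeaning the travel-time residuals cancels the unknown origin time, and the grid point with the smallest RMS residual is taken as the call location. The station geometry, sound speed, and noise-free arrivals below are invented for illustration; a robust variant would additionally downweight the occasional bad pick.

    ```python
    # Toy grid-search locator for a calling whale; geometry and values are
    # illustrative assumptions, not the study's network or data.
    import numpy as np

    stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]]) * 1e3  # OBS x,y (m)
    c = 1480.0                                                 # sound speed, m/s
    true_pos, t0 = np.array([3e3, 7e3]), 100.0
    arrivals = t0 + np.linalg.norm(stations - true_pos, axis=1) / c

    xs = ys = np.linspace(0, 10e3, 201)
    best, best_rms = None, np.inf
    for x in xs:
        for y in ys:
            tt = np.linalg.norm(stations - [x, y], axis=1) / c
            resid = arrivals - tt
            resid -= resid.mean()        # demeaning cancels the origin time
            rms = np.sqrt((resid ** 2).mean())
            if rms < best_rms:
                best, best_rms = (x, y), rms
    print("located call at", best, "with RMS residual", best_rms)
    ```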

  14. Development and Overview of CPAS Sasquatch Airdrop Landing Location Predictor Software

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin J.; Bernatovich, Michael A.

    2015-01-01

    The Capsule Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. CPAS is currently in the Engineering Development Unit (EDU) phase of testing. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish a release point from the aircraft that will ensure that the article and all items released from it during flight will land in a designated safe area. The Sasquatch footprint tool was developed to determine this safe release point and to predict the probable landing locations (footprints) of the payload and all released objects. In 2012, a new version of Sasquatch, called Sasquatch Polygons, was developed that significantly upgraded the capabilities of the footprint tool. Key improvements were an increase in the accuracy of the predictions, and the addition of an interface with the Debris Tool (DT), an in-flight debris avoidance tool for use on the test observation helicopter. Additional enhancements include improved data presentation for communication with test personnel and a streamlined code structure. This paper discusses the development, validation, and performance of Sasquatch Polygons, as well as its differences from the original Sasquatch footprint tool.

  15. Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort.

    PubMed

    Gambin, Tomasz; Akdemir, Zeynep C; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M B; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M; Eldomery, Mohammad K; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W; Boerwinkle, Eric; Beaudet, Arthur L; Gibbs, Richard A; Lupski, James R

    2017-02-28

    We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor-Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Of the 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR, and 64 were confirmed. These include 18 single-exon deletions, of which 8 were detected exclusively by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17-50% of pathogenic CNVs in different disease cohorts in which 7.1-11% of the molecular diagnostic rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
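
    The core signal such an algorithm exploits, near-zero normalized read depth over consecutive exons in one sample that is rare across the cohort, can be conveyed with a toy sketch; the thresholds and data below are invented, and the published method is considerably more involved.

    ```python
    # Toy illustration of cohort-based homozygous-deletion detection from
    # normalized exon depth; all thresholds and data are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, n_exons = 100, 50
    depth = rng.normal(1.0, 0.15, size=(n_samples, n_exons))  # normalized depth
    depth[7, 20:23] = rng.normal(0.02, 0.01, size=3)          # planted HMZ deletion

    LOW = 0.2                                   # "near-zero depth" threshold
    for s in range(n_samples):
        low = depth[s] < LOW
        if not low.any():
            continue
        # keep only events that are rare across the cohort (<5% of samples low)
        rare = (depth[:, low] < LOW).mean(axis=0) < 0.05
        if rare.all():
            print(f"sample {s}: candidate HMZ deletion over exons "
                  f"{np.flatnonzero(low)}")
    ```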

  16. Use of recombinant salmonella flagellar hook protein (flgk) for detection of anti-salmonella antibodies in chickens by automated capillary immunoassay

    USDA-ARS?s Scientific Manuscript database

    Background: Conventional immunoblot assays are a very useful tool for specific protein identification, but are tedious, labor-intensive and time-consuming. An automated capillary electrophoresis-based immunoblot assay called "Simple Western" has recently been developed that enables the protein sepa...

  17. Astronomy in Early Childhood Education: A Concept-Based Approach

    ERIC Educational Resources Information Center

    Ampartzaki, Maria; Kalogiannakis, Michail

    2016-01-01

    In an attempt to understand the natural world's phenomena, young children form their perceptions of different aspects of the macrocosm, which they contrast with new scientific concepts. This process calls for an early intervention that will provide the stimuli and the tools for the development of new concepts, ideas, and cognitive structures. The…

  18. The Human Exposure Model (HEM): A Tool to Support Rapid Assessment of Human Health Impacts from Near-Field Consumer Product Exposures

    EPA Science Inventory

    The US EPA is developing an open and publically available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposur...

  19. Cultivating Critical Game Makers in Digital Game-Based Learning: Learning from the Arts

    ERIC Educational Resources Information Center

    Denham, André R.; Guyotte, Kelly W.

    2018-01-01

    Digital games have the potential of being a transformative tool for applying constructionist principles to learning within formal and informal learning settings. Unfortunately, most recent attention has focused on instructionist games. Connected gaming provides a tantalizing alternative approach by calling for the development of games that are…

  20. "You've Got the Power": Documentary Film as a Tool of Environmental Adult Education

    ERIC Educational Resources Information Center

    Clover, Darlene E.

    2011-01-01

    Educators call for more creative means to combat the moribund narratives of contemporary environmentalism. Using visual methodology and environmental adult education theory, this article discusses how a documentary film titled "You've Got the Power" works to pose questions about complex environmental issues and develop critical thinking…

  1. Learning Geometry by Designing Persian Mosaics

    ERIC Educational Resources Information Center

    Karssenberg, Goossen

    2014-01-01

    To encourage students to do geometry, the art of Islamic geometric ornamentation was chosen as the central theme of a lesson strand which was developed using the newly presented didactical tool called "Learning by Acting". The Dutch students who took these lessons from 2010 to 2013 were challenged to act as if they themselves were Persian…

  2. Developing Computer Programming Concepts and Skills via Technology-Enriched Language-Art Projects: A Case Study

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2010-01-01

    Teaching computer programming to young children has been considered difficult because of its abstract and complex nature. The objectives of this study are (1) to investigate whether an innovative educational technology tool called Scratch could enable young children to learn abstract knowledge of computer programming while creating multimedia…

  3. Exploring the Validity of a Second Language Intercultural Pragmatics Assessment Tool

    ERIC Educational Resources Information Center

    Timpe-Laughlin, Veronika; Choi, Ikkyu

    2017-01-01

    Pragmatics has been a key component of language competence frameworks. While the majority of second/foreign language (L2) pragmatics tests have targeted productive skills, the assessment of receptive pragmatic skills remains a developing field. This study explores validation evidence for a test of receptive L2 pragmatic ability called the American…

  4. Building the Technology Toolkit of Marketing Students: The Emerging Technologies in Marketing Initiative

    ERIC Educational Resources Information Center

    Miller, Fred L.; Mangold, W. Glynn; Roach, Joy; Holmes, Terry

    2013-01-01

    New information technologies are transforming marketing practice, leading to calls for marketing academics to focus their research and teaching more tightly on areas relevant to practitioners. Developments in e-commerce, business geographic information systems (GIS), and social media offer powerful marketing tools to nontechnical users. This paper…

  5. The Miller Motivation Scale: A New Counselling and Research Tool.

    ERIC Educational Resources Information Center

    Miller, Harold J.

    The Miller Motivation Scale is a 160-item computer-scored scale. It was developed to quickly and easily measure and display the motivational profile of the client. It has eight subscales. Five subscales measure encouragement, self-fulfillment and social interest. They are called Creative, Innovative, Productive, Cooperative, and Power. Three…

  6. Relationships among Learning Styles and Motivation with Computer-Aided Instruction in an Agronomy Course

    ERIC Educational Resources Information Center

    McAndrews, Gina M.; Mullen, Russell E.; Chadwick, Scott A.

    2005-01-01

    Multi-media learning tools were developed to enhance student learning for an introductory agronomy course at Iowa State University. During fall 2002, the new interactive computer program, called Computer Interactive Multimedia Program for Learning Enhancement (CIMPLE) was incorporated into the teaching, learning, and assessment processes of the…

  7. 75 FR 41173 - Call for Information: Information on Greenhouse Gas Emissions Associated With Bioenergy and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ... oversimplify a complex issue. If this is the case, what alternative approaches or additional analytical tools... Information to solicit information and viewpoints from interested parties on approaches to accounting for... comment on developing an approach for such emissions under the Prevention of Significant Deterioration...

  8. The Portable Usability Testing Lab: A Flexible Research Tool.

    ERIC Educational Resources Information Center

    Hale, Michael E.; And Others

    A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to…

  9. Managing Change to a Quality Philosophy: A Partnership Perspective.

    ERIC Educational Resources Information Center

    Snyder, Karolyn J.; Acker-Hocevar, Michele

    Within the past 5 years there has been an international movement to adapt the principles and practices of Total Quality Management work environments to school-restructuring agendas. This paper reports on the development of a model called the Educational Quality System, a benchmark assessment tool for identifying the essential elements of quality…

  10. UserTesting.com: A Tool for Usability Testing of Online Resources

    ERIC Educational Resources Information Center

    Koundinya, Vikram; Klink, Jenna; Widhalm, Melissa

    2017-01-01

    Extension educators are increasingly using online resources in their program design and delivery. Usability testing is essential for ensuring that these resources are relevant and useful to learners. On the basis of our experiences with iteratively developing products using a testing service called UserTesting, we promote the use of fee-based…

  11. The Need to Update Space Planning Policies for the California Community Colleges. Fact Sheet 05-07

    ERIC Educational Resources Information Center

    California Postsecondary Education Commission, 2005

    2005-01-01

    California plans its development of public higher education facilities using policies called "space and utilization" guidelines and standards. These are budgetary planning tools that can measure existing and future need for academic spaces such as classrooms, laboratories, research space, and faculty offices. California's current space…

  12. Using TI-Nspire in a Modelling Teacher's Training Course

    ERIC Educational Resources Information Center

    Flores, Ángel Homero; Gómez, Adriana; Chávez, Xochitl

    2015-01-01

    Using Mathematical Modelling has become a useful tool in teaching-learning mathematics at all levels. This is so because mathematical objects are seen from their very applications, giving them meaning from the beginning. In this paper we present some details on the development of a teacher's training course called Modelling in the Teaching of…

  13. An innovative approach to compensator design

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Mcdaniel, W. L., Jr.

    1973-01-01

    The computer-aided design of a compensator for a control system is considered from a frequency-domain point of view. The design technique is based on describing the open-loop frequency response by n discrete frequency points, which yield n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; mathematical programming is then used to improve all functions whose values fall below minimum standards. To do this, several definitions for measuring the performance of a system in the frequency domain are given, e.g., relative stability, relative attenuation, and proper phasing. Next, theorems governing the number of compensator coefficients necessary to make improvements in a given number of functions are proved. A mathematical programming tool, called the constraint improvement algorithm, is then developed to aid in the solution of the problem. For applying the constraint improvement algorithm, generalized gradients for the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (Compensator Improvement Program). The practical usefulness of CIP is demonstrated by two large system examples.
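
    The constraint-improvement idea can be sketched numerically: sample the open-loop response at discrete frequencies, score each sample against a specification, and step the compensator coefficients along the gradient of the most-violated constraint. The plant, specification, and step rule below are illustrative assumptions, not the CIP formulation.

    ```python
    # Sketch of gradient-based constraint improvement on a lead compensator;
    # the plant, the |L(jw)+1| >= 0.5 margin spec, and the step rule are invented.
    import numpy as np

    w = np.logspace(-1, 2, 12)           # n discrete frequency points
    s = 1j * w

    def plant(s):
        return 40.0 / (s * (s + 2))

    def constraints(coeffs):
        a, b = coeffs                    # lead compensator (s + a)/(s + b)
        L = plant(s) * (s + a) / (s + b)
        # relative-stability proxy: keep L(jw) at least 0.5 from -1
        return np.abs(L + 1.0) - 0.5

    coeffs = np.array([1.0, 1.0])
    for _ in range(200):
        g = constraints(coeffs)
        if g.min() >= 0:                 # every frequency constraint satisfied
            break
        worst = g.argmin()               # improve the most-violated constraint
        eps = 1e-6                       # finite-difference gradient
        grad = np.array([(constraints(coeffs + eps * e)[worst] - g[worst]) / eps
                         for e in np.eye(2)])
        coeffs = np.clip(coeffs + 0.1 * grad / (np.linalg.norm(grad) + 1e-12),
                         0.05, None)
    print("coefficients:", coeffs, "min margin:", constraints(coeffs).min())
    ```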

  14. Mechanical Property Analysis in the Retracted Pin-Tool (RPT) Region of Friction Stir Welded (FSW) Aluminum Lithium 2195

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey; Oelgoetz, Peter A.

    1999-01-01

    The "Auto-Adjustable Pin Tool for Friction Stir Welding", was developed at The Marshall Space Flight Center to address process deficiencies unique to the FSW process. The auto-adjustable pin tool, also called the retractable pin-tool (R.PT) automatically withdraws the welding probe of the pin-tool into the pin-tool's shoulder. The primary function of the auto-adjustable pin-tool is to allow for keyhole closeout, necessary for circumferential welding and localized weld repair, and, automated pin-length adjustment for the welding of tapered material thickness. An overview of the RPT hardware is presented. The paper follows with studies conducted using the RPT. The RPT was used to simulate two capabilities; welding tapered material thickness and closing out the keyhole in a circumferential weld. The retracted pin-tool regions in aluminum- lithium 2195 friction stir weldments were studied through mechanical property testing and metallurgical sectioning. Correlation's can be =de between retractable pin-tool programmed parameters, process parameters, microstructure, and resulting weld quality.

  15. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.

  16. Visualization and interaction tools for aerial photograph mosaics

    NASA Astrophysics Data System (ADS)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophoto mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for the education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  17. A Unified Overset Grid Generation Graphical Interface and New Concepts on Automatic Gridding Around Surface Discontinuities

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin (Technical Monitor)

    2002-01-01

    For many years, generation of overset grids for complex configurations has required the use of a number of different independently developed software utilities. Results created by each step were then visualized using a separate visualization tool before moving on to the next. A new software tool called OVERGRID was developed which allows the user to perform all the grid generation steps and visualization under one environment. OVERGRID provides grid diagnostic functions such as surface tangent and normal checks as well as grid manipulation functions such as extraction, extrapolation, concatenation, redistribution, smoothing, and projection. Moreover, it also contains hyperbolic surface and volume grid generation modules that are specifically suited for overset grid generation. This is the first time that such a unified interface has existed for the creation of overset grids for complex geometries. New concepts on automatic overset surface grid generation around surface discontinuities will also be briefly presented. Special control curves on the surface, such as intersection curves, sharp edges, and open boundaries, are called seam curves. The seam curves are first automatically extracted from a multiple panel network description of the surface. Points where three or more seam curves meet are automatically identified and are called seam corners. Seam corner surface grids are automatically generated using a singular axis topology. Hyperbolic surface grids are then grown from the seam curves that are automatically trimmed away from the seam corners.

  18. Improved Load Alleviation Capability for the KC-135

    DTIC Science & Technology

    1997-09-01

    software, such as Matlab, Mathematica, Simulink, and Robotica Front End for Mathematica, available in the simulation laboratory. Overview: This thesis report is ... outlined in Spong's text in order to utilize the Robotica system development software, which automates the process of calculating the kinematic and ... kinematic and dynamic equations can be accomplished using a computer tool called Robotica Front End (RFE) [15], developed by Doctor Spong.

  19. DengueTools: innovative tools and strategies for the surveillance and control of dengue.

    PubMed

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.

  20. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web service queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
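
    A client in the spirit of HydroDesktop consumes WaterML from a HydroServer endpoint. The sketch below assumes a hypothetical GetValuesObject URL and query parameters, uses the WaterML 1.1 namespace commonly served by such servers, and relies on the third-party requests package; none of these specifics come from the abstract.

    ```python
    # Hedged sketch of fetching and parsing a WaterML 1.1 time series;
    # the endpoint URL and parameter names are placeholders.
    import requests
    import xml.etree.ElementTree as ET

    url = "http://example.org/cuahsi_1_1.asmx/GetValuesObject"  # hypothetical
    params = {"location": "NWISDV:10105900", "variable": "NWISDV:00060",
              "startDate": "2010-01-01", "endDate": "2010-01-31"}
    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()

    ns = {"wml": "http://www.cuahsi.org/waterML/1.1/"}
    root = ET.fromstring(resp.content)
    for v in root.iterfind(".//wml:value", ns):   # one element per observation
        print(v.get("dateTime"), v.text)
    ```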

  1. The VO-Dance web application at the IA2 data center

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo

    2012-09-01

    Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need for tools to easily accomplish data ingestion and data publishing arises. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamical way from already existent database tables or views. The tool consists of a Java web application, potentially DBMS and platform independent, that stores the services' metadata and information internally, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries consistent with the published table or view. In response to the call, VO-Dance translates the database answer back into a VO-compliant response.
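
    The endpoint-to-SQL translation can be sketched for the simple case of a cone search; the table and column names below are illustrative, and a bounding box stands in for the true angular cone.

    ```python
    # Toy endpoint-to-SQL translation for a VO-style cone search;
    # schema and the bounding-box approximation are invented for illustration.
    import sqlite3

    def cone_search_sql(table, ra, dec, sr):
        """Translate RA/DEC/SR (degrees) from a cone-search call into SQL.
        In a real service the table name would be validated against the
        registered resources; the angular values are bound as parameters."""
        sql = (f"SELECT * FROM {table} "
               "WHERE ra BETWEEN ? AND ? AND dec BETWEEN ? AND ?")
        return sql, (ra - sr, ra + sr, dec - sr, dec + sr)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE obs (ra REAL, dec REAL, mag REAL)")
    conn.execute("INSERT INTO obs VALUES (150.1, 2.2, 18.5)")
    sql, args = cone_search_sql("obs", 150.0, 2.0, 0.5)
    print(conn.execute(sql, args).fetchall())
    ```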

  2. Single-molecule fluorescence microscopy review: shedding new light on old problems

    PubMed Central

    Shashkova, Sviatlana

    2017-01-01

    Fluorescence microscopy is an invaluable tool in the biosciences, a genuine workhorse technique offering exceptional contrast in conjunction with high specificity of labelling with relatively minimal perturbation to biological samples compared with many competing biophysical techniques. Improvements in detector and dye technologies coupled to advances in image analysis methods have fuelled recent development towards single-molecule fluorescence microscopy, which can utilize light microscopy tools to enable the faithful detection and analysis of single fluorescent molecules used as reporter tags in biological samples. For example, the discovery of GFP, initiating the so-called ‘green revolution’, has pushed experimental tools in the biosciences to a completely new level of functional imaging of living samples, culminating in single fluorescent protein molecule detection. Today, fluorescence microscopy is an indispensable tool in single-molecule investigations, providing a high signal-to-noise ratio for visualization while still retaining the key features in the physiological context of native biological systems. In this review, we discuss some of the recent discoveries in the life sciences which have been enabled using single-molecule fluorescence microscopy, paying particular attention to the so-called ‘super-resolution’ fluorescence microscopy techniques in live cells, which are at the cutting-edge of these methods. In particular, how these tools can reveal new insights into long-standing puzzles in biology: old problems, which have been impossible to tackle using other more traditional tools until the emergence of new single-molecule fluorescence microscopy techniques. PMID:28694303

  3. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  4. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.
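
    A compact, self-contained approximation of synchrosqueezing (the STFT phase-reassignment variant) conveys the idea on a synthetic chirp; this is an illustration of the general technique, not the authors' implementation or data.

    ```python
    # Minimal STFT-based synchrosqueezing sketch on a synthetic chirp;
    # signal, window sizes, and the reassignment rule are illustrative.
    import numpy as np
    from scipy.signal import stft

    fs = 200.0
    t = np.arange(0, 10, 1 / fs)
    x = np.cos(2 * np.pi * (1.2 * t + 0.05 * t ** 2))  # slowly chirping "pulse"

    f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
    dt = tt[1] - tt[0]

    # instantaneous frequency: each bin's phase increment between adjacent
    # frames, measured relative to the increment expected at the bin frequency
    dphi = np.angle(Z[:, 1:] * np.conj(Z[:, :-1]))
    dev = np.angle(np.exp(1j * (dphi - 2 * np.pi * f[:, None] * dt)))
    inst_f = f[:, None] + dev / (2 * np.pi * dt)

    # squeeze: reassign each coefficient's magnitude to its instantaneous frequency
    Tx = np.zeros(inst_f.shape)
    mag = np.abs(Z[:, 1:])
    for i in range(inst_f.shape[0]):
        for j in range(inst_f.shape[1]):
            Tx[np.argmin(np.abs(f - inst_f[i, j])), j] += mag[i, j]

    ridge = f[Tx.argmax(axis=0)]          # dominant frequency per frame
    print(f"ridge spans {ridge.min():.2f} to {ridge.max():.2f} Hz")
    ```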

  5. REopt Lite Web Tool Evaluates Photovoltaics and Battery Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building on the success of the REopt renewable energy integration and optimization platform, NREL has developed a free, publicly available web version of REopt called REopt Lite. REopt Lite evaluates the economics of grid-connected photovoltaics (PV) and battery storage at a site. It allows building owners to identify the system sizes and battery dispatch strategy that minimize their life cycle cost of energy. This web tool also estimates the amount of time a PV and storage system can sustain the site's critical load during a grid outage.

  6. Development of a Suicidal Ideation Detection Tool for Primary Healthcare Settings: Using Open Access Online Psychosocial Data.

    PubMed

    Meyer, Denny; Abbott, Jo-Anne; Rehm, Imogen; Bhar, Sunil; Barak, Azy; Deng, Gary; Wallace, Klaire; Ogden, Edward; Klein, Britt

    2017-04-01

    Suicidal patients often visit healthcare professionals in their last month before suicide, but medical practitioners are unlikely to raise the issue of suicide with patients because of time constraints and uncertainty regarding an appropriate approach. A brief tool called the e-PASS Suicidal Ideation Detector (eSID) was developed for medical practitioners to help detect the presence of suicidal ideation (SI) in their clients. If SI is detected, the system alerts medical practitioners to address this issue with a client. The eSID tool was developed due to the absence of an easy-to-use, evidence-based SI detection tool for general practice. The tool was developed using binary logistic regression analyses of data provided by clients accessing an online psychological assessment function. Ten primary healthcare professionals provided advice regarding the use of the tool. The analysis identified eleven factors in addition to the Kessler-6 for inclusion in the model used to predict the probability of recent SI. The model performed well across gender and age groups 18-64 (AUR 0.834, 95% CI 0.828-0.841, N = 16,703). Healthcare professionals were interviewed; they recommended that the tool be incorporated into existing medical software systems and that additional resources be supplied, tailored to the level of risk identified. The eSID is expected to trigger risk assessments by healthcare professionals when this is necessary. Initial reactions of healthcare professionals to the tool were favorable, but further testing and in situ development are required.
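
    The modeling step, a binary logistic regression over psychosocial features including the Kessler-6 score, can be sketched as follows; the synthetic data, feature set, and coefficients are stand-ins, not the published eSID model.

    ```python
    # Hedged sketch of an eSID-style suicidal-ideation classifier;
    # all features and the data-generating coefficients are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    X = np.column_stack([
        rng.integers(6, 31, n),    # Kessler-6 total score (range 6-30)
        rng.integers(18, 65, n),   # age
        rng.integers(0, 2, n),     # prior-episode flag (hypothetical feature)
    ])
    logit = -8 + 0.35 * X[:, 0] - 0.01 * X[:, 1] + 1.0 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```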

  7. Algorithm Building and Learning Programming Languages Using a New Educational Paradigm

    NASA Astrophysics Data System (ADS)

    Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel

    2011-08-01

    This research paper presents a new concept of using a single tool to associate syntax of various programming languages, algorithms and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms, and implement them in various programming languages. The tool provides an innovative and a unified graphical user interface for development of multimedia objects, educational games and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-n-drop methods for image objects.

  8. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation that we call bone conduction (BC) sound. Several investigations have examined the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, define the FE model boundary conditions, and solve the FE equations. The tool uses the PAK solver, open source software implemented in the SIFEM FP7 project, to calculate the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.
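
    The first step of the pipeline, partitioning a CT volume into per-material masks, can be illustrated in runnable form; the Hounsfield-unit ranges are textbook approximations and the volume here is synthetic (in practice the DICOM stack would be loaded, e.g., with pydicom).

    ```python
    # Illustrative first step of the pipeline: per-material masks from a CT
    # volume by Hounsfield-unit range; volume and thresholds are invented.
    import numpy as np

    ct = np.random.normal(0, 500, size=(64, 64, 64))   # stand-in CT volume (HU)
    hu_ranges = {"soft_tissue": (-100, 300), "bone": (300, 2000)}

    masks = {name: (ct >= lo) & (ct < hi) for name, (lo, hi) in hu_ranges.items()}
    for name, m in masks.items():
        print(name, "voxels:", int(m.sum()))
    # Each mask would next become an STL surface (marching cubes), then a
    # tetrahedral mesh with material properties, and finally a PAK solver run.
    ```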

  9. A Practical Guide to the Technology and Adoption of Software Process Automation

    DTIC Science & Technology

    1994-03-01

    IDE’s integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual ... tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4

  10. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery.

  11. Barriers to innovation: nurses' risk appraisal in using a new ethics screening and early intervention tool.

    PubMed

    Pavlish, Carol L; Hellyer, Joan Henriksen; Brown-Saltzman, Katherine; Miers, Anne G; Squire, Karina

    2013-01-01

    We developed and assessed feasibility of an Ethics Screening and Early Intervention Tool that identifies at-risk clinical situations and prompts early actions to mitigate conflict and moral distress. Despite intensive care unit and oncology nurses' reports of tool benefits, they noted some risk to themselves when initiating follow-up actions. The riskiest actions were discussing ethical concerns with physicians, calling for ethics consultation, and initiating patient conversations. When discussing why initiating action was risky, participants revealed themes such as "being the troublemaker" and "questioning myself." To improve patient care and teamwork, all members of the health care team need to feel safe in raising ethics-related questions.

  12. NEuro COgnitive REhabilitation for Disease of Addiction (NECOREDA) Program: From Development to Trial

    PubMed Central

    Rezapour, Tara; Hatami, Javad; Farhoudian, Ali; Sofuoglu, Mehmet; Noroozi, Alireza; Daneshmand, Reza; Samiei, Ahmadreza; Ekhtiari, Hamed

    2015-01-01

    Despite extensive evidence for cognitive deficits associated with drug use and multiple publications supporting the efficacy of cognitive rehabilitation treatment (CRT) services for drug addictions, there are few well-structured tools and organized programs to improve cognitive abilities in substance users. Most published studies on cognitive rehabilitation for drug dependent patients used rehabilitation tools that were originally designed for other conditions, such as schizophrenia or traumatic brain injury, and not specifically for drug dependent patients. These studies also suffer from small sample sizes, lack of follow-up assessments, and/or a lack of comprehensive treatment outcome measures. To address these limitations, we decided to develop and investigate the efficacy of a paper-and-pencil cognitive rehabilitation package called NECOREDA (Neurocognitive Rehabilitation for Disease of Addiction) to improve neurocognitive deficits associated with drug dependence, particularly those caused by stimulants (e.g., amphetamine-type stimulants and cocaine) and opiates. To evaluate the feasibility of the NECOREDA program, we conducted a pilot study with 10 opiate and methamphetamine dependent patients for 3 months in an outpatient setting. NECOREDA was revised based on qualitative comments received from clients and treatment providers. The final version of NECOREDA is composed of brain training exercises called "Brain Gym" and psychoeducational modules called "Brain Treasures", implemented in 16 training sessions interleaved with 16 review and practice sessions. NECOREDA will be evaluated as an add-on intervention to methadone maintenance treatment in a randomized clinical trial among opiate dependent patients starting in August 2015. We discuss the methodological features of NECOREDA development and evaluation in this article. PMID:26649167

  13. Systematic comparison of variant calling pipelines using gold standard personal exome variants

    PubMed Central

    Hwang, Sohyun; Kim, Eiru; Lee, Insuk; Marcotte, Edward M.

    2015-01-01

    The success of clinical genomics using next generation sequencing (NGS) requires the accurate and consistent identification of personal genome variants. Assorted variant calling methods have been developed, which show low concordance between their calls. Hence, a systematic comparison of the variant callers could give important guidance to NGS-based clinical genomics. Recently, a set of high-confidence variant calls for one individual (NA12878) has been published by the Genome in a Bottle (GIAB) consortium, enabling performance benchmarking of different variant calling pipelines. Based on the gold standard reference variant calls from GIAB, we compared the performance of thirteen variant calling pipelines, testing combinations of three read aligners (BWA-MEM, Bowtie2, and Novoalign) and four variant callers (Genome Analysis Toolkit HaplotypeCaller (GATK-HC), Samtools mpileup, FreeBayes, and Ion Proton Variant Caller (TVC)) on twelve data sets for the NA12878 genome sequenced by different platforms, including Illumina 2000, Illumina 2500, and Ion Proton, with various exome capture systems and exome coverage. We observed different biases toward specific types of SNP genotyping errors by the different variant callers. The results of our study provide useful guidelines for reliable variant identification from deep sequencing of personal genomes. PMID:26639839
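
    One aligner/caller combination from the comparison (BWA-MEM plus GATK HaplotypeCaller) can be wired together as below; the file paths are placeholders and only a minimal, commonly used set of flags is shown, not the study's exact invocation.

    ```python
    # Sketch of a BWA-MEM + GATK HaplotypeCaller pipeline via subprocess;
    # all input/output paths are placeholders.
    import subprocess

    ref, r1, r2 = "ref.fa", "sample_R1.fq.gz", "sample_R2.fq.gz"

    # align with BWA-MEM (read group is required by GATK) and coordinate-sort
    with open("sample.sorted.bam", "wb") as bam:
        bwa = subprocess.Popen(
            ["bwa", "mem", "-R", r"@RG\tID:run1\tSM:NA12878", ref, r1, r2],
            stdout=subprocess.PIPE)
        subprocess.run(["samtools", "sort", "-"],
                       stdin=bwa.stdout, stdout=bam, check=True)
        bwa.wait()
    subprocess.run(["samtools", "index", "sample.sorted.bam"], check=True)

    # call variants with GATK HaplotypeCaller
    subprocess.run(["gatk", "HaplotypeCaller", "-R", ref,
                    "-I", "sample.sorted.bam", "-O", "sample.vcf.gz"], check=True)
    ```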

  14. Occupational voice demands and their impact on the call-centre industry.

    PubMed

    Hazlett, D E; Duffy, O M; Moorhead, S A

    2009-04-20

    Within the last decade there has been a growth in the call-centre industry in the UK, with a growing awareness of the voice as an important tool for successful communication. Occupational voice problems such as occupational dysphonia, in a business which relies on healthy, effective voice as the primary professional communication tool, may threaten working ability and occupational health and safety of workers. While previous studies of telephone call-agents have reported a range of voice symptoms and functional vocal health problems, there have been no studies investigating the use and impact of vocal performance in the communication industry within the UK. This study aims to address a significant gap in the evidence-base of occupational health and safety research. The objectives of the study are: 1. to investigate the work context and vocal communication demands for call-agents; 2. to evaluate call-agents' vocal health, awareness and performance; and 3. to identify key risks and training needs for employees and employers within call-centres. This is an occupational epidemiological study, which plans to recruit call-centres throughout the UK and Ireland. Data collection will consist of three components: 1. interviews with managers from each participating call-centre to assess their communication and training needs; 2. an online biopsychosocial questionnaire will be administered to investigate the work environment and vocal demands of call-agents; and 3. voice acoustic measurements of a random sample of participants using the Multi-dimensional Voice Program (MDVP). Qualitative content analysis from the interviews will identify underlying themes and issues. A multivariate analysis approach will be adopted using Structural Equation Modelling (SEM), to develop voice measurement models in determining the construct validity of potential factors contributing to occupational dysphonia. Quantitative data will be analysed using SPSS version 15. Ethical approval is granted for this study from the School of Communication, University of Ulster. The results from this study will provide the missing element of voice-based evidence, by appraising the interactional dimensions of vocal health and communicative performance. This information will be used to inform training for call-agents and to contribute to health policies within the workplace, in order to enhance vocal health.

  15. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, the tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and connected to either the simulation software or the SCADA of the plant. To this end, DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through a full-scale application example. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
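
    A generic sketch conveys the kind of aeration loop one would design and tune in such a tool: a PI controller holding dissolved oxygen (DO) at a setpoint by adjusting airflow. The one-tank DO balance and all parameters below are illustrative assumptions, not the paper's plant or tuning.

    ```python
    # Illustrative PI aeration control of dissolved oxygen; the DO balance,
    # diurnal load profile, and gains are invented for this sketch.
    import numpy as np

    dt, horizon = 60.0, 24 * 3600        # 1-minute steps over one simulated day
    do_sat = 9.0                         # DO saturation, mg/L
    k_transfer = 1.4e-6                  # transfer rate per unit airflow, 1/s
    setpoint, kp, ki = 2.0, 40.0, 0.02   # DO setpoint (mg/L) and PI gains

    do, flow, integral = 4.0, 80.0, 0.0
    for k in range(int(horizon / dt)):
        # diurnal oxygen uptake by the biomass (mg/L/s), illustrative profile
        load = 8e-4 * (1 + 0.5 * np.sin(2 * np.pi * k * dt / 86400))
        do += dt * (k_transfer * flow * (do_sat - do) - load)  # simple DO balance
        err = setpoint - do
        integral += err * dt
        flow = float(np.clip(80.0 + kp * err + ki * integral, 0.0, 200.0))
    print(f"end of day: DO = {do:.2f} mg/L, airflow = {flow:.1f} (arbitrary units)")
    ```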

  16. Production of recombinant Salmonella flagellar protein, FlgK, and its uses in detection of anti-Salmonella antibodies in chickens by automated capillary immunoassay

    USDA-ARS?s Scientific Manuscript database

    Conventional immunoblot assays have been a very useful tool for specific protein identification in the past several decades, but are tedious, labor-intensive and time-consuming. An automated capillary electrophoresis-based immunoblot assay called "Simple Western" has recently been developed that en...

  17. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  18. Sports Business Unit Meets Cross-Curricular Learning Goals: Grades 9-12

    ERIC Educational Resources Information Center

    Curriculum Review, 2006

    2006-01-01

    A new online learning tool called the eCommSports Kit links a seven-step sports marketing curriculum with a school team to give students real-life experience in developing and executing a plan to boost game attendance. The kit, available through http://www.ecommsports.com, takes teens on a cross-curricular journey through conducting business…

  19. Teaching 2.0: Teams Keep Teachers and Students Plugged into Technology

    ERIC Educational Resources Information Center

    Bourgeois, Michelle; Hunt, Bud

    2011-01-01

    A Colorado district develops a two-year program that gives teacher teams an opportunity to learn how to use digital tools in the classroom. Called the Digital Learning Collaborative, it is built on three things about professional learning: (1) Learning takes time; (2) Learning is a social process; and (3) Learning about technology should be…

  20. Forage resource evaluation system for habitat—deer: an interactive deer habitat model

    Treesearch

    Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris

    2012-01-01

    We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...

  1. Computing Education in Children's Early Years: A Call for Debate

    ERIC Educational Resources Information Center

    Manches, Andrew; Plowman, Lydia

    2017-01-01

    International changes in policy and curricula (notably recent developments in England) have led to a focus on the role of computing education in the early years. As interest in the potential of computing education has increased, there has been a proliferation of programming tools designed for young children. While these changes are broadly to be…

  2. Temperature Coefficient for Modeling Denitrification in Surface Water Sediments Using the Mass Transfer Coefficient

    Treesearch

    T. W. Appelboom; G. M. Chescheir; R. W. Skaggs; J. W. Gilliam; Devendra M. Amatya

    2006-01-01

    Watershed modeling has become an important tool for researchers, given the high costs of water quality monitoring. When modeling nitrate transport within drainage networks, denitrification within the sediments needs to be accounted for. Birgand et al. developed an equation using a term called a mass transfer coefficient to mathematically describe sediment...

  3. Using Concept Mapping as a Tool for Program Theory Development

    ERIC Educational Resources Information Center

    Orsi, Rebecca

    2011-01-01

    The purpose of this methodological study is to explore how well a process called "concept mapping" (Trochim, 1989) can articulate the theory which underlies a social program. Articulation of a program's theory is a key step in completing a sound theory based evaluation (Weiss, 1997a). In this study, concept mapping is used to…

  4. Horse Racing at the Library: How One Library System Increased the Usage of Some of Its Online Databases

    ERIC Educational Resources Information Center

    Kurhan, Scott H.; Griffing, Elizabeth A.

    2011-01-01

    Reference services in public libraries are changing dramatically. The Internet, online databases, and shrinking budgets are all making it necessary for non-traditional reference staff to become familiar with online reference tools. Recognizing the need for cross-training, Chesapeake Public Library (CPL) developed a program called the Database…

  5. Towards a Pedagogical Model for Science Education: Bridging Educational Contexts through a Blended Learning Approach

    ERIC Educational Resources Information Center

    Bidarra, José; Rusman, Ellen

    2017-01-01

    This paper proposes a design framework to support science education through blended learning, based on a participatory and interactive approach supported by ICT-based tools, called "Science Learning Activities Model" (SLAM). The development of this design framework started as a response to complex changes in society and education (e.g.…

  6. Temperature coefficient for modeling denitrification in surface water sediments using the mass transfer coefficient

    Treesearch

    T.W. Appelboom; G.M. Chescheir; F. Birgand; R.W. Skaggs; J.W. Gilliam; D. Amatya

    2010-01-01

    Watershed modeling has become an important tool for researchers. Modeling nitrate transport within drainage networks requires quantifying the denitrification within the sediments in canals and streams. In a previous study, several of the authors developed an equation using a term called a mass transfer coefficient to mathematically describe sediment denitrification....

  7. Miscue Analysis: A Transformative Tool for Researchers, Teachers, and Readers

    ERIC Educational Resources Information Center

    Goodman, Yetta M.

    2015-01-01

    When a reader produces a response to a written text (the observed response) that is not expected by the listener, the result is called a miscue. Using psychosociolinguistic analyses of miscues in the context of an authentic text, miscue analysis provides evidence to discover how readers read. I present miscue analysis history and development and…

  9. GRIDVIEW: Recent Improvements in Research and Education Software for Exploring Mars Topography

    NASA Technical Reports Server (NTRS)

    Roark, J. H.; Frey, H. V.

    2001-01-01

    We have developed an Interactive Data Language (IDL) scientific visualization software tool called GRIDVIEW that can be used in research and education to explore and study the most recent Mars Orbiter Laser Altimeter (MOLA) gridded topography of Mars (http://denali.gsfc.nasa.gov/mola_pub/gridview). Additional information is contained in the original extended abstract.

  10. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    ERIC Educational Resources Information Center

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  11. MENDEL: An Intelligent Computer Tutoring System for Genetics Problem-Solving, Conjecturing, and Understanding.

    ERIC Educational Resources Information Center

    Streibel, Michael; And Others

    1987-01-01

    Describes an advice-giving computer system being developed for genetics education called MENDEL that is based on research in learning, genetics problem solving, and expert systems. The value of MENDEL as a design tool and the tutorial function are stressed. Hypothesis testing, graphics, and experiential learning are also discussed. (Author/LRW)

  12. The P.E.A.C.E. Pack: A Computerized Online Assessment of School Bullying

    ERIC Educational Resources Information Center

    Slee, Phillip T.; Mohyla, Jury

    2014-01-01

    School bullying is an international problem with harmful outcomes for those involved. This study describes the design and field testing of an innovative computer-based social learning tool for assessing student perceptions of bullying developed for an Australian intervention program called the P.E.A.C.E. Pack. Students rate their peer group…

  13. REopt Lite Web Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a free, publicly available web version of the REopt (TM) renewable energy integration and optimization platform called REopt Lite. REopt Lite recommends the optimal size and dispatch strategy for grid-connected photovoltaics (PV) and battery storage at a site. It also allows users to explore how PV and storage can increase a site's resiliency during a grid outage.

  14. Developing Strategic and Reasoning Abilities with Computer Games at Primary School Level

    ERIC Educational Resources Information Center

    Bottino, R. M.; Ferlino, L.; Ott, M.; Tavella, M.

    2007-01-01

    The paper reports a small-scale, long-term pilot project designed to foster strategic and reasoning abilities in young primary school pupils by engaging them in a number of computer games, mainly those usually called mind games (brainteasers, puzzlers, etc.). In this paper, the objectives, work methodology, experimental setting, and tools used in…

  15. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM model for grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  16. Evaluation of somatic copy number estimation tools for whole-exome sequencing data.

    PubMed

    Nam, Jae-Yong; Kim, Nayoung K D; Kim, Sang Cheol; Joung, Je-Gun; Xi, Ruibin; Lee, Semin; Park, Peter J; Park, Woong-Yang

    2016-03-01

    Whole-exome sequencing (WES) has become a standard method for detecting genetic variants in human diseases. Although the primary use of WES data has been the identification of single nucleotide variations and indels, these data also offer a possibility of detecting copy number variations (CNVs) at high resolution. However, WES data have uneven read coverage along the genome owing to the target capture step, and the development of a robust WES-based CNV tool is challenging. Here, we evaluate six WES somatic CNV detection tools: ADTEx, CONTRA, Control-FREEC, EXCAVATOR, ExomeCNV and Varscan2. Using WES data from 50 kidney chromophobe, 50 bladder urothelial carcinoma, and 50 stomach adenocarcinoma patients from The Cancer Genome Atlas, we compared the CNV calls from the six tools with a reference CNV set that was identified by both single nucleotide polymorphism array 6.0 and whole-genome sequencing data. We found that these algorithms gave highly variable results: visual inspection reveals significant differences between the WES-based segmentation profiles and the reference profile, as well as among the WES-based profiles. Using a 50% overlap criterion, 13-77% of WES CNV calls were covered by CNVs from the reference set, up to 21% of the copy gains were called as losses or vice versa, and dramatic differences in CNV sizes and CNV numbers were observed. Overall, ADTEx and EXCAVATOR had the best performance with relatively high precision and sensitivity. We suggest that the current algorithms for somatic CNV detection from WES data are limited in their performance and that more robust algorithms are needed.
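
    A 50% overlap criterion of the kind used in this comparison can be made concrete with a short sketch. The fragment below is illustrative only, not code from the study: intervals are simple (start, end) tuples assumed to lie on the same chromosome, and a real comparison would also match chromosomes and distinguish gains from losses.

    ```python
    def overlap_fraction(a, b):
        """Fraction of interval a covered by interval b; intervals are (start, end)."""
        inter = max(0, min(a[1], b[1]) - max(a[0], b[0]))
        return inter / (a[1] - a[0])

    def matched_calls(calls, reference, min_frac=0.5):
        """Count CNV calls covered >= min_frac by at least one reference CNV."""
        return sum(any(overlap_fraction(c, r) >= min_frac for r in reference)
                   for c in calls)

    calls = [(100, 500), (800, 1200), (5000, 5100)]
    reference = [(50, 450), (900, 2000)]
    print(matched_calls(calls, reference))  # 2: the third call has no match
    ```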

  17. Progress on automated data analysis algorithms for ultrasonic inspection of composites

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2015-03-01

    Progress is presented on the development and demonstration of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. New algorithms have been implemented to reliably identify indications in time-of-flight images near the front and back walls of composite panels. Adaptive call criteria have also been applied to address sensitivity to variation in backwall signal level, panel thickness variation, and internal signal noise. ADA processing results are presented for a variety of test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions. Software tools have been developed to support both ADA algorithm design and certification, producing a statistical evaluation of indication results and false calls using a matching process with predefined truth tables. Parametric studies were performed to evaluate detection and false call results with respect to varying algorithm settings.
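
    To make the idea of an adaptive call criterion concrete, the sketch below flags backwall amplitude dropout against a locally estimated signal level rather than a fixed threshold, so the call adapts to slow variation in backwall level and panel thickness. This is a minimal illustration under assumed parameters (window size, drop fraction), not the authors' ADA implementation.

    ```python
    import numpy as np

    def backwall_dropout(backwall_amp, window=15, drop_fraction=0.5):
        """Flag scan positions where the backwall echo amplitude falls below
        a fraction of the local median level (an adaptive call criterion)."""
        amp = np.asarray(backwall_amp, dtype=float)
        half = window // 2
        flags = np.zeros(amp.size, dtype=bool)
        for i in range(amp.size):
            local = amp[max(0, i - half): i + half + 1]  # sliding window
            flags[i] = amp[i] < drop_fraction * np.median(local)
        return flags

    # Example: a dip in the backwall signal is flagged; the plateau is not.
    signal = np.array([1.0] * 10 + [0.3] * 3 + [1.0] * 10)
    print(np.where(backwall_dropout(signal))[0])  # indices 10..12
    ```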

  18. Evaluation of a simple method for the automatic assignment of MeSH descriptors to health resources in a French online catalogue.

    PubMed

    Névéol, Aurélie; Pereira, Suzanne; Kerdelhué, Gaetan; Dahamna, Badisse; Joubert, Michel; Darmoni, Stéfan J

    2007-01-01

    The growing number of resources to be indexed in the catalogue of online health resources in French (CISMeF) calls for curating strategies involving automatic indexing tools while maintaining the catalogue's high indexing quality standards. The objective was to develop a simple automatic tool that retrieves MeSH descriptors from document titles. In parallel to research on advanced indexing methods, a bag-of-words tool was developed for timely inclusion in CISMeF's maintenance system. An evaluation was carried out on a corpus of 99 documents. The indexing sets retrieved by the automatic tool were compared to manual indexing based on the title and on the full text of resources. 58% of the major main headings were retrieved by the bag-of-words algorithm, and the precision on main heading retrieval was 69%. Bag-of-words indexing has effectively been used on selected resources to be included in CISMeF since August 2006. Meanwhile, ongoing work aims at improving the current version of the tool.
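
    The bag-of-words approach can be suggested with a toy sketch: normalize the words of a title and look each one up in a term-to-descriptor lexicon. The lexicon entries and function below are hypothetical; the actual CISMeF tool works against the full French MeSH vocabulary and its quality rules.

    ```python
    # Hypothetical mini-lexicon mapping normalized (French) terms to MeSH descriptors.
    MESH_LEXICON = {
        "asthme": "Asthma",
        "diabete": "Diabetes Mellitus",
        "grippe": "Influenza, Human",
    }

    def index_title(title):
        """Return MeSH descriptors whose lexicon terms appear in the title."""
        words = {w.strip(".,;:!?()").lower() for w in title.split()}
        return sorted({desc for term, desc in MESH_LEXICON.items() if term in words})

    print(index_title("Asthme et diabete chez les enfants"))
    # ['Asthma', 'Diabetes Mellitus']
    ```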

  19. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on the development of a language for the specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  20. Southern African Office of Astronomy for Development: A New Hub for Astronomy for Development

    NASA Astrophysics Data System (ADS)

    Mutondo, Moola S.; Simpemba, Prospery

    2016-10-01

    A new Astronomy for Development hub needs innovative tools and programs. SAROAD is developing exciting tools integrating Raspberry Pi technology to bring cost-effective astronomy content to learning centres. SAROAD would also like to report achievements in realizing the IAU's strategic plan. In order to manage, evaluate and coordinate regional IAU (International Astronomical Union) capacity building programmes, including the recruitment and mobilization of volunteers, SAROAD has built an intranet that is accessible to regional members upon request. Using this resource, regional members can see and participate in regional activities. SAROAD has commenced with projects in the three Task Force areas of Universities and Research, Children and Schools and Public Outreach. Under the three Task Force areas, a total of seven projects have commenced in Zambia (some supported by funds from IAU Annual Call for proposals).

  1. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Astrophysics Data System (ADS)

    Jaggi, S.

    1993-02-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analyses and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics used to derive the performance parameters.
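
    As an illustration of how such performance parameters relate, NETD can be obtained by dividing the noise-equivalent radiance by the derivative of Planck radiance with respect to scene temperature. The sketch below shows this standard relationship under assumed units (spectral radiance in W m^-2 sr^-1 um^-1, wavelength in um); it is a generic textbook relation, not ATTIRE's actual code.

    ```python
    import numpy as np

    def planck_radiance(wavelength_um, temp_k):
        """Spectral radiance from Planck's law, W / (m^2 sr um)."""
        c1 = 1.191042e8   # 2*h*c^2 in W um^4 / (m^2 sr)
        c2 = 1.4387752e4  # h*c/k in um K
        lam = wavelength_um
        return c1 / (lam**5 * (np.exp(c2 / (lam * temp_k)) - 1.0))

    def netd(ner, wavelength_um, temp_k=300.0, dt=0.1):
        """NETD = NER / (dL/dT), with dL/dT from a central finite difference."""
        dl_dt = (planck_radiance(wavelength_um, temp_k + dt)
                 - planck_radiance(wavelength_um, temp_k - dt)) / (2.0 * dt)
        return ner / dl_dt

    # Example: NETD for an assumed NER of 1e-3 W/(m^2 sr um) at 10 um, 300 K.
    print(netd(1e-3, 10.0))
    ```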

  2. Development of Gis Tool for the Solution of Minimum Spanning Tree Problem using Prim's Algorithm

    NASA Astrophysics Data System (ADS)

    Dutta, S.; Patra, D.; Shankar, H.; Alok Verma, P.

    2014-11-01

    A minimum spanning tree (MST) of a connected, undirected and weighted network is a tree of that network consisting of all its nodes such that the sum of the weights of its edges is minimum among all possible spanning trees of the same network. In this study, we have developed a new GIS tool that uses the well-known elementary algorithm called Prim's algorithm to construct the minimum spanning tree of a connected, undirected and weighted road network. This algorithm is based on the weight (adjacency) matrix of a weighted network and helps to solve the complex network MST problem easily, efficiently and effectively. Selecting an appropriate algorithm is essential; otherwise it is very hard to obtain an optimal result. For a road transportation network, it is essential to find optimal results by considering all the necessary points based on a cost factor (time or distance). This paper addresses the MST problem of a road network by finding its minimum span while considering all the important network junction points. GIS technology is usually used to solve network-related problems such as the optimal path problem, the travelling salesman problem, vehicle routing problems, location-allocation problems, etc. Therefore, in this study we have developed a customized GIS tool, using a Python script in ArcGIS software, for the solution of the MST problem for the road transportation network of Dehradun city, with distance and time as the impedance (cost) factors. The tool is user-friendly, so users do not need deep knowledge of the subject, and it provides access to varied information adapted to their needs. This GIS tool for MST can be applied to a nationwide plan called Prime Minister Gram Sadak Yojana in India to provide optimal all-weather road connectivity to unconnected villages (points). The tool is also useful for constructing highways or railways spanning several cities optimally, or for connecting all cities with minimum total road length.
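
    Prim's algorithm on a weight (adjacency) matrix, as the abstract describes, is compact enough to sketch directly. The following is an illustrative Python version with an invented toy network; the ArcGIS tool itself wraps this kind of logic in geoprocessing scripts.

    ```python
    import math

    def prim_mst(weights):
        """Prim's algorithm on a dense weight matrix.

        weights[i][j] is the edge weight between nodes i and j,
        or math.inf if no direct edge exists. Returns the MST edges.
        """
        n = len(weights)
        in_tree = [False] * n
        best_cost = [math.inf] * n   # cheapest edge linking each node to the tree
        best_link = [-1] * n
        best_cost[0] = 0.0           # grow the tree from node 0
        edges = []
        for _ in range(n):
            # pick the cheapest node not yet in the tree
            u = min((i for i in range(n) if not in_tree[i]),
                    key=lambda i: best_cost[i])
            in_tree[u] = True
            if best_link[u] >= 0:
                edges.append((best_link[u], u, weights[best_link[u]][u]))
            for v in range(n):       # relax edges out of u
                if not in_tree[v] and weights[u][v] < best_cost[v]:
                    best_cost[v] = weights[u][v]
                    best_link[v] = u
        return edges

    # Example: a small road network with travel times (minutes) as weights.
    W = [[math.inf, 4, 2, math.inf],
         [4, math.inf, 1, 5],
         [2, 1, math.inf, 8],
         [math.inf, 5, 8, math.inf]]
    print(prim_mst(W))  # [(0, 2, 2), (2, 1, 1), (1, 3, 5)]
    ```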

  3. VCS: Tool for Visualizing Copy Number Variation and Single Nucleotide Polymorphism.

    PubMed

    Kim, HyoYoung; Sung, Samsun; Cho, Seoae; Kim, Tae-Hun; Seo, Kangseok; Kim, Heebal

    2014-12-01

    Copy number variation (CNV) and single nucleotide polymorphism (SNP) data are useful genetic resources to aid in understanding complex phenotypes or disease susceptibility. Although thousands of CNVs and SNPs are currently available in public databases, they are somewhat difficult to use for analyses without visualization tools. We developed a web-based tool called VCS (visualization of CNV or SNP) to visualize detected CNVs or SNPs. The VCS tool can help users easily interpret the biological meaning of the numerical values of CNVs and SNPs. The VCS provides six visualization tools: i) the enrichment of genome contents in CNV; ii) the physical distribution of CNV or SNP on chromosomes; iii) the distribution of log2 ratios of CNVs meeting criteria of interest; iv) the number of CNVs or SNPs per binning unit; v) the distribution of homozygosity of SNP genotypes; and vi) a cytomap of genes within CNV or SNP regions.
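
    As a small example of the kind of interpretation such a tool supports, a region's copy number state is often read off its log2 depth ratio. The sketch below uses common diploid-genome cutoffs for illustration; the thresholds and function are assumptions, not part of VCS.

    ```python
    import math

    def classify_cnv(test_depth, ref_depth, gain_cutoff=0.58, loss_cutoff=-1.0):
        """Classify a region from its log2 depth ratio. The cutoffs roughly
        correspond to 3 copies (log2(3/2) ~ 0.585) and 1 copy (log2(1/2) = -1)
        in a diploid genome; the values here are illustrative only."""
        ratio = math.log2(test_depth / ref_depth)
        if ratio >= gain_cutoff:
            return "gain", ratio
        if ratio <= loss_cutoff:
            return "loss", ratio
        return "neutral", ratio

    print(classify_cnv(60, 30))  # ('gain', 1.0): twice the reference depth
    ```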

  4. TRIP-ID: A tool for a smart and interactive identification of Magic Formula tyre model parameters from experimental data acquired on track or test rig

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco

    2018-03-01

    Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades, Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify, with a high degree of accuracy and reliability, MF micro-parameters from experimental data deriving from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data made evident by the tool itself. A motorsport application of the tool is shown as a case study.
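
    The Magic Formula itself, and a generic least-squares identification of its four macro-parameters, can be sketched as follows. This is a minimal illustration on synthetic data, assuming SciPy's curve_fit is acceptable; TRIP-ID's interactive, diagnostics-driven identification of the micro-parameters is considerably richer than this.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def magic_formula(slip, B, C, D, E):
        """Pacejka Magic Formula:
        F = D * sin(C * arctan(B*s - E*(B*s - arctan(B*s))))."""
        bs = B * slip
        return D * np.sin(C * np.arctan(bs - E * (bs - np.arctan(bs))))

    # Synthetic (slip, force) samples standing in for telemetry or rig data.
    slip = np.linspace(-0.3, 0.3, 200)
    force = magic_formula(slip, B=10.0, C=1.9, D=4500.0, E=0.97)
    force += np.random.normal(0.0, 50.0, slip.size)   # measurement noise

    # Least-squares identification from an initial guess p0 = [B, C, D, E].
    popt, _ = curve_fit(magic_formula, slip, force, p0=[8.0, 1.5, 4000.0, 0.9])
    print(popt)  # recovered parameters, close to the generating values
    ```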

  5. Status of the Combustion Devices Injector Technology Program at the NASA MSFC

    NASA Technical Reports Server (NTRS)

    Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James

    2005-01-01

    To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for the fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus include injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi- element effects and improve capability and validity to analyze heat transfer and ignition in large, multi-element injectors.

  6. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  7. Usability analysis of 2D graphics software for designing technical clothing.

    PubMed

    Teodoroski, Rita de Cassia Clark; Espíndola, Edilene Zilma; Silva, Enéias; Moro, Antônio Renato Pereira; Pereira, Vera Lucia D V

    2012-01-01

    With the advent of technology, the computer has become a working tool increasingly present in companies. Its purpose is to increase production and reduce the errors inherent in manual production. The aim of this study was to analyze the usability of 2D graphics software used by a professional to create clothing designs during his work. The movements of the mouse, keyboard and graphical tools were monitored in real time by the software Camtasia 7® installed on the user's computer. To register the use of the mouse and keyboard, we used auxiliary software called MouseMeter®, which quantifies the number of presses of the right, middle and left mouse buttons and of the keyboard, as well as the distance in meters traveled by the cursor on the screen. Data were collected in periods of 15 minutes, 1 hour and 8 hours, consecutively. The results showed that the job is repetitive and demands high physical effort, which can lead to the appearance of repetitive strain injuries. Thus, to minimize operator effort and thereby enhance the usability of the examined tool, it becomes imperative to replace the mouse with a device called a tablet, which offers an electronic pen and a drawing platform for design development.

  8. Check and Report Ebola (CARE) Hotline: The User Perspective of an Innovative Tool for Postarrival Monitoring of Ebola in the United States.

    PubMed

    McCarthy, Ilana Olin; Wojno, Abbey E; Joseph, Heather A; Teesdale, Scott

    2017-11-14

    The response to the 2014-2016 Ebola epidemic included an unprecedented effort from federal, state, and local public health authorities to monitor the health of travelers entering the United States from countries with Ebola outbreaks. The Check and Report Ebola (CARE) Hotline, a novel approach to monitoring, was designed to enable travelers to report their health status daily to an interactive voice recognition (IVR) system. The system was tested with 70 Centers for Disease Control and Prevention (CDC) federal employees returning from deployments in outbreak countries. The objective of this study was to describe the development of the CARE Hotline as a tool for postarrival monitoring and examine the usage characteristics and user experience of the tool during a public health emergency. Data were obtained from two sources. First, the CARE Hotline system produced a call log which summarized the usage characteristics of all 70 users' daily health reports. Second, we surveyed federal employees (n=70) who used the CARE Hotline to engage in monitoring. A total of 21 (21/70, 30%) respondents were included in the survey analytic sample. While the CARE Hotline was used for monitoring, 70 users completed a total of 1313 calls. We found that 94.06% (1235/1313) of calls were successful, and the average call time significantly decreased from the beginning of the monitoring period to the end by 32 seconds (Z score=-6.52, P<.001). CARE Hotline call log data were confirmed by user feedback; survey results indicated that users became more familiar with the system and found the system easier to use, from the beginning to the end of their monitoring period. The majority of the users were highly satisfied (90%, 19/21) with the system, indicating ease of use and convenience as primary reasons, and would recommend it for future monitoring efforts (90%, 19/21). The CARE Hotline garnered high user satisfaction, required minimal reporting time from users, and was an easily learned tool for monitoring. This phone-based technology can be modified for future public health emergencies.

  9. One project of Educational Innovation applying new Information and Communications Technologies (ICT): CyberAula 2.0

    NASA Astrophysics Data System (ADS)

    Mendiola, M. A.; Aguado, P. L.; Espejo, R.

    2012-04-01

    The main objective of the CyberAula 2.0 project is to support, record and validate videoconferencing and lecture recording services by integrating the Polytechnic University of Madrid (UPM) Moodle platform with the GlobalPlaza platform and the Isabel videoconference tool. All the software used in the project is open source. Isabel (Quemada et al., 2005) is a real-time collaboration tool for the Internet that supports advanced collaborative web/videoconferencing with application sharing and TV-like media integration. GlobalPlaza (Barra et al., 2011) is the web platform used to schedule, perform, stream, record and publish the videoconferences automatically; it was developed in the context of the GLOBAL project, a research project supported by the European Commission's seventh framework programme. Both tools were developed at UPM and are specifically designed for educational purposes. Each class session is broadcast on the Internet and then recorded, enabling geographically distant students to participate live in on-campus classes, asking questions through a chat tool or through the videoconference itself; students can later review the recorded lectures through Moodle when needed. This paper presents the project as developed at the Escuela Técnica Superior de Ingenieros Agrónomos (ETSIA) in a Secondary Cycle free-elective subject, which is currently in the process of expiring with the introduction of new curricula within the framework of the European Higher Education Area. Students participate in this subject with outstanding interest, acquiring transversal competences as they must prepare and present a report in the last week of the semester. The background of the project was the subject "Plants of agro-alimentary interest", which has a remarkably attractive practical component within the Centre's subject offer and is one of the most demanded free-elective subjects; students participate actively, both in practical workshops and in the individual presentation of reports. In the workshops they must identify, describe, classify and even taste several species of agro-alimentary interest (fruits of temperate or tropical regions, aromatic plants and spices, edible mushrooms, and cereals and pseudocereals), many of them formerly unknown to most students. They are also asked to fill in questionnaires in order to consolidate concepts and to evaluate their personal participation in the subject's development.

  10. What Do Monkey Calls Mean?

    PubMed

    Schlenker, Philippe; Chemla, Emmanuel; Zuberbühler, Klaus

    2016-12-01

    A field of primate linguistics is gradually emerging. It combines general questions and tools from theoretical linguistics with rich data gathered in experimental primatology. Analyses of several monkey systems have uncovered very simple morphological and syntactic rules and have led to the development of a primate semantics that asks new questions about the division of semantic labor between the literal meaning of monkey calls, additional mechanisms of pragmatic enrichment, and the environmental context. We show that comparative studies across species may validate this program and may in some cases help in reconstructing the evolution of monkey communication over millions of years.

  11. A Multi-Discipline Approach to Digitizing Historic Seismograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Andrew

    2016-04-07

    Retriever Technology has developed, and made available free of charge, a seismogram digitization software package called SKATE (Seismogram Kit for Automatic Trace Extraction). We have developed an extensive set of algorithms that process seismogram image files, provide editing tools, and output time series data. The software is available online at seismo.redfish.com. To demonstrate the speed and cost effectiveness of the software, we have processed over 30,000 images.

  12. Integrating research tools to support the management of social-ecological systems under climate change

    USGS Publications Warehouse

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  13. Evaluation of roadside emergency call box technology : a summary report : technical assistance report.

    DOT National Transportation Integrated Search

    2003-04-01

    Introduction: Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...

  14. The Western Aeronautical Test Range. Chapter 10 Tools

    NASA Technical Reports Server (NTRS)

    Knudtson, Kevin; Park, Alice; Downing, Robert; Sheldon, Jack; Harvey, Robert; Norcross, April

    2011-01-01

    The Western Aeronautical Test Range (WATR) staff at the NASA Dryden Flight Research Center is developing translation software called Chapter 10 Tools in response to challenges posed by post-flight processing of data files originating from various on-board digital recorders that follow the Range Commanders Council Inter-Range Instrumentation Group (IRIG) 106 Chapter 10 Digital Recording Standard but use differing interpretations of the Standard. The software will read the data files regardless of the vendor's implementation of the source recorder, displaying data, identifying and correcting errors, and producing a data file that can be successfully processed post-flight.

  15. Development of a prenatal psychosocial screening tool for post-partum depression and anxiety.

    PubMed

    McDonald, Sheila; Wall, Jennifer; Forbes, Kaitlin; Kingston, Dawn; Kehler, Heather; Vekved, Monica; Tough, Suzanne

    2012-07-01

    Post-partum depression (PPD) is the most common complication of pregnancy in developed countries, affecting 10-15% of new mothers. There has been a shift from thinking in terms of PPD per se toward a broader consideration of poor mental health, including anxiety, after giving birth. Some risk factors for poor mental health in the post-partum period can be identified prenatally; however, prenatal screening tools developed to date have had poor sensitivity and specificity. The objective of this study was to develop a screening tool that identifies women at risk of distress, operationalized by elevated symptoms of depression and anxiety in the post-partum period, using information collected in the prenatal period. Using data from the All Our Babies Study, a prospective cohort study of pregnant women living in Calgary, Alberta (N = 1578), we developed an integer score-based prediction rule for the prevalence of PPD, defined as scoring 10 or higher on the Edinburgh Postnatal Depression Scale (EPDS) at 4 months postpartum. The best-fit model included known risk factors for PPD: depression and stress in late pregnancy, history of abuse, and poor relationship quality with partner. Comparison of the screening tool with the EPDS in late pregnancy showed that our tool had significantly better sensitivity. Further validation of our tool was seen in its utility for identifying elevated symptoms of postpartum anxiety. This research heeds the call for further development and validation work using psychosocial factors identified prenatally for identifying poor mental health in the post-partum period.
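
    An integer score-based prediction rule of the kind described can be sketched in a few lines: each prenatal risk factor contributes points, and a cutoff flags women for follow-up. The factor names below mirror those named in the abstract, but the point values and cutoff are invented for illustration; they are not the study's weights.

    ```python
    # Illustrative point values only -- not the weights from the study.
    RISK_POINTS = {
        "depression_late_pregnancy": 3,
        "stress_late_pregnancy": 2,
        "history_of_abuse": 2,
        "poor_partner_relationship": 2,
    }

    def prenatal_risk_score(factors):
        """Sum integer points for each risk factor present."""
        return sum(RISK_POINTS[f] for f in factors if f in RISK_POINTS)

    def flag_for_followup(factors, cutoff=4):
        """Flag women whose score meets the (illustrative) cutoff."""
        return prenatal_risk_score(factors) >= cutoff

    print(flag_for_followup({"depression_late_pregnancy", "history_of_abuse"}))
    # True: 3 + 2 = 5 >= 4
    ```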

  16. Development of T-STAT for Early Autism Screening

    ERIC Educational Resources Information Center

    Chiang, Chung-Hsin; Wu, Chin-Chin; Hou, Yuh-Ming; Chu, Ching-Lin; Liu, Jiun-Horng; Soong, Wei-Tsuen

    2013-01-01

    This study's purpose was to modify the Screening Tool for Autism in Two-Year-Olds (STAT) into a Taiwanese version called T-STAT. Study 1 included 15 children with Autism and 15 children with Developmental Delay (DD) or language impairment (LI) aged between 24 and 35 months. Study 2 had 77 young children with Autism, PDD-NOS, or DD/LI as a…

  17. Electro-Optic Propagation

    DTIC Science & Technology

    2003-09-30

    Electro-Optic Propagation. Stephen Doss-Hammel, SPAWARSYSCEN San Diego, Code 2858, 49170 Propagation Path, San Diego, CA 92152-7385, phone: (619... scenarios to extend the capabilities of TAWS to surface and low-altitude situations. OBJECTIVES: The electro-optical propagation objectives are: 1... development of a new propagation assessment tool called EOSTAR (Electro-Optical Signal Transmission and Ranging). The goal of the EOSTAR project is to

  18. Research and development supporting risk-based wildfire effects prediction for fuels and fire management: Status and needs

    Treesearch

    Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague

    2013-01-01

    Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...

  19. Automated MeSH indexing of the World-Wide Web.

    PubMed Central

    Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.

    1995-01-01

    To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421

  20. SPESS: A New Instrument for Measuring Student Perceptions in Earth and Ocean Science

    ERIC Educational Resources Information Center

    Jolley, Allison; Lane, Erin; Kennedy, Ben; Frappé-Sénéclauze, Tom-Pierre

    2012-01-01

    This paper discusses the development and results of a new tool used for measuring shifts in students' perceptions of earth and ocean sciences called the Student Perceptions about Earth Sciences Survey (SPESS). The survey measures where students lie on the novice--expert continuum, and how their perceptions change after taking one or more earth and…

  1. Rapid Analysis and Manufacturing Propulsion Technology (RAMPT)

    NASA Technical Reports Server (NTRS)

    Fikes, John C.

    2018-01-01

    NASA's strategic plan calls for the development of enabling technologies, improved production methods, and advanced design and analysis tools related to the agency's objectives to expand human presence in the solar system. NASA seeks to advance exploration, science, innovation, benefits to humanity, and international collaboration, as well as facilitate and utilize U.S. commercial capabilities to deliver cargo and crew to space.

  2. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  3. Virtual Machine Language

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John

    2005-01-01

    Virtual Machine Language (VML) is a mission-independent, reusable software system for programming spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence-execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences within times of the order of seconds.
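
    The block abstraction at the core of VML can be suggested with a loose Python analogy: named, parameterized blocks registered with an interpreter that expands a sequence into commands. This is not VML syntax, only an illustration of the pattern; the block, command, and function names are invented.

    ```python
    # Registry of named blocks, standing in for block files aboard the spacecraft.
    BLOCKS = {}

    def block(fn):
        """Register a function as a named, parameterized block."""
        BLOCKS[fn.__name__] = fn
        return fn

    @block
    def heater_on(duration_s, setpoint_c):
        # A block expands into the command steps it implements.
        return [("CMD_HEATER", "ON", setpoint_c), ("WAIT", duration_s)]

    def run_sequence(sequence):
        """Interpret a sequence of (block_name, kwargs) pairs into commands,
        the way a compact uplinked sequence calls on-board blocks."""
        commands = []
        for name, kwargs in sequence:
            commands.extend(BLOCKS[name](**kwargs))
        return commands

    print(run_sequence([("heater_on", {"duration_s": 30, "setpoint_c": 20})]))
    # [('CMD_HEATER', 'ON', 20), ('WAIT', 30)]
    ```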

  4. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Rudokas, Mary R.

    1988-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control, and trend analysis of the Space Station Thermal Control System (TCS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined here with examples from the thermal system to highlight the motivating factors behind them, followed by an overview of the capabilities of MTK, which was developed to address these issues in a generic fashion.

  5. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  6. Open source bioimage informatics for cell biology

    PubMed Central

    Swedlow, Jason R.; Eliceiri, Kevin W.

    2009-01-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery. PMID:19833518

  7. A New Approach on the Long Term Dynamics of NEO's Under Yarkovsky Effect.

    NASA Astrophysics Data System (ADS)

    Peláez, Jesús; Urrutxua, Hodei; Bombardelli, Claudio; Perez-Grande, Isabel

    2011-12-01

    A classical approach to the many-body problem is that of using special perturbation methods. Nowadays, owing to the availability of high-speed computers, this is an essential tool in space dynamics which exhibits a great advantage: it is applicable to any orbit involving any number of bodies and all sorts of astrodynamical problems, especially when these problems fall into regions in which general perturbation theories are absent. One such case is, for example, Near Earth Object (NEO) dynamics. In this field, the Group of Tether Dynamics of UPM (GDT) has developed a new regularisation scheme - called DROMO - which is characterised by only 8 ODEs. This new regularisation scheme allows a new approach to the long-term dynamics of NEOs, especially appropriate for considering the influence of anisotropic thermal emission (the Yarkovsky and YORP effects) on the dynamics. A new project, called NEODROMO, has been started in GDT that aims to provide a reliable tool for the long-term dynamics of NEOs.

  8. ART-Ada design project, phase 2

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    Interest in deploying expert systems in Ada has increased. An Ada-based expert system tool called ART-Ada is described, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  9. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a data-derived decision-support tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions, but can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  10. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for the development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  11. Low-energy electron beam proximity projection lithography (LEEPL): the world's first e-beam production tool, LEEPL 3000

    NASA Astrophysics Data System (ADS)

    Behringer, Uwe F. W.

    2004-06-01

    In June 2000, the companies Accretech and LEEPL Corporation decided to develop an e-beam lithography tool for high-throughput wafer exposure, called LEEPL. In an amazingly short time the alpha tool was built. In 2002 the beta tool was installed at Accretech. Today the first production tool, the LEEPL 3000, is ready to be shipped. The 2 keV e-beam tool will be used in a mix-and-match lithography strategy with optical exposure tools to expose critical levels such as gate structures, contact holes (CH), and via patterns of the 90 nm and 65 nm nodes. At the SEMATECH EPL workshop on September 22nd in Cambridge, England, it was mentioned that the number of these levels will increase very rapidly (8 in 2007; 13 in 2010; and 17 in 2013). The schedule of the production tool for the 45 nm node is mid-2005, and for the 32 nm node, 2008. Figure 1 shows, from left to right, the α-tool, the β-tool and the production tool LEEPL 3000. Figure 1 also shows the timetable of the four LEEPL forums, all held in Japan.

  12. Improving team information sharing with a structured call-out in anaesthetic emergencies: a randomized controlled trial.

    PubMed

    Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C

    2014-06-01

    Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and if this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video-intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique where nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend to improve information-probe sharing and medical management in the intervention group, and across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis.

  13. Terminal Area Conflict Detection and Resolution Tool

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora

    2011-01-01

    This poster will describe analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as low as 2 per hour.

  14. Flight test validation of a design procedure for digital autopilots

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.

    1983-01-01

    Commercially available general aviation autopilots are currently in transition from an analogue circuit system to a computer implemented digital flight control system. Well known advantages of the digital autopilot include enhanced modes, self-test capacity, fault detection, and greater computational capacity. A digital autopilot's computational capacity can be used to full advantage by increasing the sophistication of the digital autopilot's chief function, stability and control. NASA's Langley Research Center has been pursuing the development of direct digital design tools for aircraft stabilization systems for several years. This effort has most recently been directed towards the development and realization of multi-mode digital autopilots for GA aircraft, conducted under a SPIFR-related program called the General Aviation Terminal Operations Research (GATOR) Program. This presentation focuses on the implementation and testing of a candidate multi-mode autopilot designed using these newly developed tools.

  15. Holmes: a graphical tool for development, simulation and analysis of Petri net based models of complex biological systems.

    PubMed

    Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr

    2017-12-01

    Model development and analysis is a fundamental step in systems biology. The theory of Petri nets offers a tool for such a task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. From this follows the problem of using different programs to analyse a single model: many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form is a great help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms, and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. piotr@cs.put.poznan.pl.
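
    The linear algebra behind invariant generation can be sketched briefly: for an incidence matrix C (places x transitions), T-invariants are solutions of C·x = 0 (firing vectors that return the net to its marking) and P-invariants are solutions of C^T·y = 0 (conserved weighted token sums). The sketch below uses SymPy's rational null space on a toy net; production invariant generators such as the one in Holmes compute minimal semi-positive integer solutions instead, so this is only the underlying idea.

    ```python
    import sympy as sp

    # Incidence matrix C (places x transitions) for a toy net:
    # p1 --t1--> p2 --t2--> p1  (a simple cycle).
    C = sp.Matrix([[-1,  1],
                   [ 1, -1]])

    # T-invariants: solutions of C * x = 0.
    t_invariants = C.nullspace()
    # P-invariants: solutions of C^T * y = 0.
    p_invariants = C.T.nullspace()

    print(t_invariants)  # [Matrix([[1], [1]])]: fire t1 and t2 once each
    print(p_invariants)  # [Matrix([[1], [1]])]: tokens in p1 + p2 are conserved
    ```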

  16. Auscultation

    MedlinePlus

    ... is usually done using a tool called a stethoscope. Health care providers routinely listen to a person's ... unborn infants. This can be done with a stethoscope or with sound waves (called Doppler ultrasound). Auscultation ...

  17. Development and formative evaluation of a visual e-tool to help decision makers navigate the evidence around health financing.

    PubMed

    Skordis-Worrall, Jolene; Pulkki-Brännström, Anni-Maria; Utley, Martin; Kembhavi, Gayatri; Bricki, Nouria; Dutoit, Xavier; Rosato, Mikey; Pagel, Christina

    2012-12-21

    There are calls for low and middle income countries to develop robust health financing policies to increase service coverage. However, existing evidence around financing options is complex and often difficult for policy makers to access. The aims were to summarize the evidence on the impact of financing health systems and to develop an e-tool to help decision makers navigate the findings. After reviewing the literature, we used thematic analysis to summarize the impact of 7 common health financing mechanisms on 5 common health system goals. Information on the relevance of each study to a user's context was provided by 11 country indicators. A Web-based e-tool was then developed to assist users in navigating the literature review. This tool was evaluated using feedback from early users, collected using an online survey and in-depth interviews with key informants. The e-tool provides graphical summaries that allow a user to assess the following parameters with a single snapshot: the number of relevant studies available in the literature, the heterogeneity of evidence, where key evidence is lacking, and how closely the evidence matches their own context. Users particularly liked the visual display and found navigating the tool intuitive. However, there was concern that a lack of evidence of positive impact might be construed as evidence against a financing option, and that the tool might over-simplify the available financing options. Complex evidence can be made more easily accessible and potentially more understandable using basic Web-based technology and innovative graphical representations that match findings to the users' goals and context.

  18. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing, and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.

  19. Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.

    2000-01-01

    The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many sub-domains called blocks, and solve the governing equations over these blocks. The dynamic load-balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computations. In environments with computers of different architecture, operating systems, CPU speed, memory size, load, and network speed, balancing the loads and managing the communication between processors becomes crucial. Load-balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computer environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application will be discussed. Also, the developed algorithms were combined with the load-sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results will be presented on running a NASA-based code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
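
    As a rough illustration of the scheduling problem being solved (a hedged sketch, not the authors' algorithm), the Python snippet below greedily assigns blocks to heterogeneous processors by speed-normalized load; the block costs and processor speeds are made up.

```python
import heapq

# A classic LPT-style heuristic: place the largest blocks first, each on the
# processor whose speed-normalized load is currently smallest.
def balance(block_costs, proc_speeds):
    """Return a {block index: processor index} assignment."""
    heap = [(0.0, p) for p in range(len(proc_speeds))]  # (normalized load, proc)
    heapq.heapify(heap)
    assignment = {}
    for b in sorted(range(len(block_costs)), key=lambda i: -block_costs[i]):
        load, p = heapq.heappop(heap)
        assignment[b] = p
        heapq.heappush(heap, (load + block_costs[b] / proc_speeds[p], p))
    return assignment

# Five blocks of differing cost, one slow and one 2x-fast processor.
print(balance([8, 4, 7, 3, 5], [1.0, 2.0]))
# -> {0: 0, 2: 1, 4: 1, 1: 1, 3: 0}: the fast processor takes more work
```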

  20. 21st century toolkit for optimizing population health through precision nutrition.

    PubMed

    O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth

    2017-07-05

    Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.

  1. ToTem: a tool for variant calling pipeline optimization.

    PubMed

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
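
    The benchmarking step that such an optimizer repeats amounts to scoring a call set against a truth set. Below is a minimal Python sketch of that scoring with hypothetical variants; ToTem itself is a Java/PHP web application, so this is only the idea, not its code.

```python
# Score a variant call set against a truth set (hypothetical variants).
def benchmark(called: set, truth: set) -> dict:
    tp = len(called & truth)          # variants both called and real
    fp = len(called - truth)          # false positives
    fn = len(truth - called)          # missed variants
    precision = tp / (tp + fp) if called else 0.0
    recall = tp / (tp + fn) if truth else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return {"precision": precision, "recall": recall, "f_measure": f}

truth = {("chr1", 1042, "A>G"), ("chr2", 887, "C>T"), ("chr3", 55, "G>A")}
called = {("chr1", 1042, "A>G"), ("chr2", 887, "C>T"), ("chr7", 9, "T>C")}
print(benchmark(called, truth))  # precision, recall and F-measure all 2/3
```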

  2. Exploring New Methods of Displaying Bit-Level Quality and Other Flags for MODIS Data

    NASA Technical Reports Server (NTRS)

    Khalsa, Siri Jodha Singh; Weaver, Ron

    2003-01-01

    The NASA Distributed Active Archive Center (DAAC) at the National Snow and Ice Data Center (NSIDC) archives and distributes snow and sea ice products derived from the MODerate resolution Imaging Spectroradiometer (MODIS) on board NASA's Terra and Aqua satellites. All MODIS standard products are in the Earth Observing System version of the Hierarchical Data Format (HDF-EOS). The MODIS science team has packed a wealth of information into each HDF-EOS file. In addition to the science data arrays containing the geophysical product, there are often pixel-level Quality Assurance arrays which are important for understanding and interpreting the science data. Currently, researchers are limited in their ability to access and decode information stored as individual bits in many of the MODIS science products. Commercial and public domain utilities give users access, in varying degrees, to the elements inside MODIS HDF-EOS files. However, when attempting to visualize the data, users are confronted with the fact that many of the elements actually represent eight different 1-bit arrays packed into a single byte array. This project addressed the need for researchers to access bit-level information inside MODIS data files. In a previous NASA-funded project (ESDIS Prototype ID 50.0) we developed a visualization tool tailored to polar gridded HDF-EOS data sets. This tool, called PHDIS, allows researchers to access, geolocate, visualize, and subset data that originate from different sources and have different spatial resolutions but which are placed on a common polar grid. The bit-level visualization function developed under this project was added to PHDIS, resulting in a versatile tool that serves a variety of needs. We call this the EOS Imaging Tool.
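
    Bit-packed QA fields of the kind described are unpacked with bitwise shifts and masks. The following Python/numpy sketch is illustrative only; the field layout is hypothetical, not the actual MODIS QA specification.

```python
import numpy as np

# Three packed QA bytes (hypothetical values).
qa = np.array([0b00000001, 0b00000110, 0b10000000], dtype=np.uint8)

def unpack_bit(arr: np.ndarray, bit: int) -> np.ndarray:
    """Return a 0/1 array for the given bit position (0 = least significant)."""
    return (arr >> bit) & 1

def unpack_field(arr: np.ndarray, lo: int, width: int) -> np.ndarray:
    """Return a multi-bit field, e.g. a 2-bit confidence code."""
    return (arr >> lo) & ((1 << width) - 1)

print(unpack_bit(qa, 0))       # [1 0 0]  e.g. a hypothetical "produced" flag
print(unpack_field(qa, 1, 2))  # [0 3 0]  e.g. a hypothetical 2-bit quality code
```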

  3. DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool

    PubMed Central

    Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary

    2008-01-01

    Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444

  4. Space transportation, satellite services, and space platforms

    NASA Technical Reports Server (NTRS)

    Disher, J. H.

    1979-01-01

    The paper previews the progressive development of vehicles for space transportation, satellite services, and orbital platforms. A low-thrust upper stage of either the ion engine or chemical type will be developed to transport large spacecraft and space platforms to and from GEO. The multimission spacecraft, space telescope, and other scientific platforms will require orbital services going beyond those provided by the Shuttle's remote manipulator system, and plans call for extravehicular activity tools, improved remote manipulators, and a remote manned work station (the cherry picker).

  5. Characterization of Cloud Water-Content Distribution

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon

    2010-01-01

    The development of realistic cloud parameterizations for climate models requires accurate characterizations of subgrid distributions of thermodynamic variables. To this end, a software tool was developed to characterize cloud water-content distributions in climate-model sub-grid scales. This software characterizes distributions of cloud water content with respect to cloud phase, cloud type, precipitation occurrence, and geo-location using CloudSat radar measurements. It uses a statistical method called maximum likelihood estimation to estimate the probability density function of the cloud water content.
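
    For a lognormal model, a common assumption for water-content distributions, the maximum likelihood estimates have a closed form: the mean and standard deviation of the log values. The Python sketch below illustrates the statistical idea on synthetic data; it is not the described software.

```python
import numpy as np

def lognormal_mle(samples: np.ndarray) -> tuple:
    """Closed-form MLE of lognormal parameters from positive samples."""
    logs = np.log(samples)
    mu = logs.mean()            # MLE of the log-scale mean
    sigma = logs.std(ddof=0)    # MLE uses the biased (1/n) estimator
    return mu, sigma

def lognormal_pdf(x, mu, sigma):
    """Probability density function of the fitted distribution."""
    return np.exp(-((np.log(x) - mu) ** 2) / (2 * sigma**2)) / (
        x * sigma * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
water_content = rng.lognormal(mean=-2.0, sigma=0.8, size=10_000)  # g/m^3
mu, sigma = lognormal_mle(water_content)
print(round(mu, 2), round(sigma, 2))  # close to the true (-2.0, 0.8)
```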

  6. A short review of variants calling for single-cell-sequencing data with applications.

    PubMed

    Wei, Zhuohui; Shu, Chang; Zhang, Changsheng; Huang, Jingying; Cai, Hongmin

    2017-11-01

    The field of single-cell sequencing is rapidly expanding, and many techniques have been developed in the past decade. With this technology, biologists can study not only the heterogeneity between two adjacent cells in the same tissue or organ, but also the evolutionary relationships and degenerative processes in a single cell. Calling variants is the main purpose of analyzing single cell sequencing (SCS) data. Currently, some popular methods used for bulk-cell-sequencing data analysis are adapted directly to SCS data. However, SCS requires an extra step of genome amplification to accumulate enough material to satisfy sequencing needs. The amplification introduces large biases and thus raises challenges for using the bulk-cell-sequencing methods. This paper aims to bridge that gap, providing guidance both for the development of specialized analysis methods and for the use of currently available tools with SCS data. We first introduce two popular genome amplification methods and compare their capabilities. We then introduce a few popular models for calling single-nucleotide polymorphisms and copy-number variations. Finally, breakthrough applications of SCS are summarized to demonstrate its potential for researching cell evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.

    PubMed

    Sinaiko, Anna D; Rosenthal, Meredith B

    2016-04-01

    Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools. Project HOPE—The People-to-People Health Foundation, Inc.

  8. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
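
    The premise of transform coding is that an orthonormal transform such as the DCT-II compacts a smooth prediction residue into a few coefficients that quantize cheaply. The sketch below is a generic 1-D illustration in Python, not one of VP10's actual transforms.

```python
import numpy as np

def dct_ii(x: np.ndarray) -> np.ndarray:
    """Orthonormal DCT-II of a 1-D signal (naive O(n^2) form for clarity)."""
    n = len(x)
    k = np.arange(n)
    # basis[f, i] = cos(pi * (2i + 1) * f / (2n)); rows are frequencies.
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    scale = np.sqrt(2 / n) * np.where(k == 0, np.sqrt(0.5), 1.0)
    return scale * (basis @ x)

residue = np.array([3.0, 3.2, 3.1, 2.9, 3.0, 3.2, 3.1, 3.0])  # smooth residue
coeffs = dct_ii(residue)
print(np.round(coeffs, 2))  # energy concentrated in the DC coefficient
```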

  9. Learning to Make Change Happen in Chinese Schools: Adapting a Problem-Based Computer Simulation for Developing School Leaders

    ERIC Educational Resources Information Center

    Hallinger, Philip; Shaobing, Tang; Jiafang, Lu

    2017-01-01

    School leader training has become a critical strategy in educational reform. However, in China a large gap remains in how to transfer leadership knowledge into practice. Thus, tools that can integrate formal knowledge into practice are urgently called for in school leader training. This paper presents the results of a research…

  10. The Math-Biology Values Instrument: Development of a Tool to Measure Life Science Majors' Task Values of Using Math in the Context of Biology

    ERIC Educational Resources Information Center

    Andrews, Sarah E.; Runyon, Christopher; Aikens, Melissa L.

    2017-01-01

    In response to calls to improve the quantitative training of undergraduate biology students, there have been increased efforts to better integrate math into biology curricula. One challenge of such efforts is negative student attitudes toward math, which are thought to be particularly prevalent among biology students. According to theory,…

  11. The Science Consistency Review A Tool To Evaluate the Use of Scientific Information in Land Management Decisionmaking

    Treesearch

    James M. Guldin; David Cawrse; Russell Graham; Miles Hemstrom; Linda Joyce; Steve Kessler; Ranotta McNair; George Peterson; Charles G. Shaw; Peter Stine; Mark Twery; Jeffrey Walter

    2003-01-01

    The paper outlines a process called the science consistency review, which can be used to evaluate the use of scientific information in land management decisions. Developed with specific reference to land management decisions in the U.S. Department of Agriculture Forest Service, the process involves assembling a team of reviewers under a review administrator to...

  12. School Subject Paradigms and Teaching Practice in the Screen Culture: Art, Music and the Mother Tongue (Swedish) under Pressure

    ERIC Educational Resources Information Center

    Erixon, Per-Olof; Marner, Anders; Scheid, Manfred; Strandberg, Tommy; Ortegren, Hans

    2012-01-01

    There are great expectations that new digital technology will become a powerful tool for developing education activities. Like many countries in Europe and worldwide, Sweden has invested a large amount of resources in new technology and new media (hereafter called digital media), and they have become a natural and important part of school…

  13. Computerized Biomechanical Man-Model

    DTIC Science & Technology

    1976-07-01

    Air Force Systems Command, Wright-Patterson AFB, Ohio. ABSTRACT: The COMputerized BIomechanical MAN-Model (called COMBIMAN) is a computer interactive graphics... The concept was to build a mock-up which permitted the designer to visualize the... The use of mock-ups for biomechanical evaluation has long been a tool... but can become an obstacle to design change. At the Aerospace Medical Research Laboratory, we are developing a computerized biomechanical man-model.

  14. Wildland fire potential: A tool for assessing wildfire risk and fuels management needs

    Treesearch

    Greg Dillon; James Menakis; Frank Fay

    2015-01-01

    Federal wildfire managers often want to know, over large landscapes, where wildfires are likely to occur and how intense they may be. To meet this need we developed a map that we call wildland fire potential (WFP) - a raster geospatial product that can help to inform evaluations of wildfire risk or prioritization of fuels management needs across very large spatial...

  15. [Andragogy: reality or utopy].

    PubMed

    Wautier, J L; Vileyn, F

    2004-07-01

    The education of adults differs from that of children, and the methods used should take into account that adults have specific goals and diverse knowledge. As the teaching of children is called pedagogy, the teaching of adults is now known as andragogy. Andragogy has led to the development of several approaches to improve continuing education. Several tools and methodologies have been created for adult education.

  16. Association analysis of the monoamine oxidase A gene in bipolar affective disorder by using family-based internal controls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noethen, M.M.; Eggermann, K.; Propping, P.

    1995-10-01

    It is well accepted that association studies are a major tool in investigating the contribution of single genes to the development of diseases that do not follow a simple Mendelian inheritance pattern (so-called complex traits). Such major psychiatric diseases as bipolar affective disorder and schizophrenia clearly fall into this category of diseases. 7 refs., 1 tab.

  17. iTree-Hydro: Snow hydrology update for the urban forest hydrology model

    Treesearch

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2011-01-01

    This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest Effects—Hydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...

  18. The Situations Bank, a Tool for Curriculum Design Focused on Daily Realities: The Case of the Reform in Niger

    ERIC Educational Resources Information Center

    Charland, Patrick; Cyr, Stéphane

    2013-01-01

    In the context of the curriculum reform in Niger, the authors describe the process of developing a situations bank which focusses on everyday life situations in Niger. The bank plays a central role in the formulation of new study programmes guided by the so-called "situated" approach. The authors also describe various issues that arose…

  19. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background: The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results: Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions: Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  20. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
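
    The regression idea described, using all observed data rather than a two-point calculation, can be sketched as an ordinary least-squares fit of log abundance ratios over time, where the slope estimates the net growth-rate (fitness) difference. All numbers below are hypothetical.

```python
import numpy as np

days = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
# Measured mutant/wild-type abundance ratios at each time point.
ratio = np.array([1.0, 1.6, 2.7, 4.4, 7.1])

# Fit log(ratio) = intercept + slope * t by ordinary least squares;
# np.polyfit returns the highest-degree coefficient first.
slope, intercept = np.polyfit(days, np.log(ratio), deg=1)
print(f"relative fitness (net growth-rate difference): {slope:.3f} /day")
```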

  1. An interactive distance solution for stroke rehabilitation in the home setting - A feasibility study.

    PubMed

    Palmcrantz, Susanne; Borg, Jörgen; Sommerfeld, Disa; Plantin, Jeanette; Wall, Anneli; Ehn, Maria; Sjölinder, Marie; Boman, Inga-Lill

    2017-09-01

    In this study an interactive distance solution (called the DISKO tool) was developed to enable home-based motor training after stroke. The overall aim was to explore the feasibility and safety of using the DISKO-tool, customized for interactive stroke rehabilitation in the home setting, in different rehabilitation phases after stroke. Fifteen patients in three different stages in the continuum of rehabilitation after stroke participated in a home-based training program using the DISKO-tool. The program included 15 training sessions with recurrent follow-ups by a physiotherapist through the integrated application for video communication. Safety and feasibility were assessed from patients, physiotherapists, and a technician using logbooks, interviews, and a questionnaire. Qualitative content analysis and descriptive statistics were used in the analysis. Fourteen out of 15 patients completed the training period, with a mean of 19.5 minutes spent on training at each session. The DISKO-tool was found to be useful and safe by patients and physiotherapists. This study demonstrates the feasibility and safety of the DISKO-tool and provides guidance for further development and testing of interactive distance technology for home rehabilitation, to be used by health care professionals and patients in different phases of rehabilitation after stroke.

  2. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  3. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and, together, present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that ultimately will run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach which will be described in this presentation.

  4. Planetary Sample Caching System Design Options

    NASA Technical Reports Server (NTRS)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  5. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.

  6. ProGeRF: Proteome and Genome Repeat Finder Utilizing a Fast Parallel Hash Function

    PubMed Central

    Moraes, Walas Jhony Lopes; Rodrigues, Thiago de Souza; Bartholomeu, Daniella Castanheira

    2015-01-01

    Repetitive element sequences are adjacent repeating patterns, also called motifs, and can be of different lengths; repetitions can involve exact or approximate copies. They have been widely used as molecular markers in population biology. Given the sizes of sequenced genomes, various bioinformatics tools have been developed for the extraction of repetitive elements from DNA sequences. However, currently available tools do not provide options for identifying repetitive elements in both the genome and the proteome, displaying a user-friendly web interface, and performing exhaustive searches. ProGeRF is a web site for extracting repetitive regions from genome and proteome sequences. It was designed to be an efficient, fast, accurate and primarily user-friendly web tool allowing many ways to view and analyse the results. ProGeRF (Proteome and Genome Repeat Finder) is freely available as a stand-alone program, from which users can download the source code, and as a web tool. It was developed using the hash-table approach to extract perfect and imperfect repetitive regions in a (multi)FASTA file, while maintaining linear time complexity. PMID:25811026
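
    The hash-table approach the authors mention can be illustrated in a few lines of Python: index every k-mer of a sequence in a dictionary and report those seen more than once. This toy finds exact repeats only, whereas ProGeRF also handles imperfect ones.

```python
from collections import defaultdict

def exact_repeats(seq: str, k: int) -> dict:
    """Map each k-mer occurring more than once to its start positions."""
    positions = defaultdict(list)
    for i in range(len(seq) - k + 1):
        positions[seq[i:i + k]].append(i)   # expected O(1) hash-table insert
    return {kmer: pos for kmer, pos in positions.items() if len(pos) > 1}

print(exact_repeats("ACGTACGTTTACGT", k=4))
# -> {'ACGT': [0, 4, 10], 'TACG': [3, 9]}
```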

  7. A comprehensive conceptual framework for road safety strategies.

    PubMed

    Hughes, B P; Anund, A; Falkmer, T

    2016-05-01

    Road safety strategies (generally called Strategic Highway Safety Plans in the USA) provide essential guidance for actions to improve road safety, but often lack a conceptual framework that is comprehensive, systems-theory based, and underpinned by evidence from research and practice. This paper aims to provide such a framework, incorporating all components, the policy tools by which they are changed, and the general interactions between them. A framework of nine mutually interacting components that contribute to crashes and ten generic policy tools which can be applied to reduce the outcomes of these crashes was developed and used to assess 58 road safety strategies from 22 countries across 15 years. The work identifies the policy tools that are most and least widely applied to components, highlighting the potential for improvements to any individual road safety strategy, and the potential strengths and weaknesses of road safety strategies in general. The framework also provides guidance for the development of new road safety strategies, identifying potential consequences of policy-tool-based measures with regard to exposure and risk, useful for both mobility and safety objectives. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Development of an education campaign to reduce delays in pre-hospital response to stroke.

    PubMed

    Caminiti, Caterina; Schulz, Peter; Marcomini, Barbara; Iezzi, Elisa; Riva, Silvia; Scoditti, Umberto; Zini, Andrea; Malferrari, Giovanni; Zedde, Maria Luisa; Guidetti, Donata; Montanari, Enrico; Baratti, Mario; Denti, Licia

    2017-06-24

    Systematic reviews call for well-designed trials with clearly described intervention components to support the effectiveness of educational campaigns to reduce patient delay in stroke presentation. We herein describe the systematic development process of a campaign aimed at increasing stroke awareness and preparedness. Campaign development followed Intervention Mapping (IM), a theory- and evidence-based tool, and was organized in two phases: needs assessment and intervention development. In phase 1, two cross-sectional surveys were performed, one aiming to measure stroke awareness in the target population and the other to analyze the behavioral determinants of prehospital delay. In phase 2, a matrix of proximal program objectives was developed, theory-based intervention methods and practical strategies were selected, and program components and materials were produced. In phase 1, the survey on 202 citizens highlighted underestimation of symptom severity, as in only 44% of stroke situations respondents would choose to call the emergency service (EMS). In the survey on 393 consecutive patients, 55% presented over 2 hours after symptom onset; major determinants were deciding to call the general practitioner first and the reaction of the first person the patient called. In phase 2, adult individuals were identified as the target of the intervention, both as potential "patients" and witnesses of stroke. The low educational level found in the patient survey called for a narrative approach in cartoon form. The family setting was chosen for the message because 42% of patients who presented within 2 hours had been advised by a family member to call EMS. To act on people's tendency to view stroke as an untreatable disease, it was decided to avoid fear-arousal appeals and use a positive message providing instructions and hope. Focus groups were used to test educational products and identify the most suitable sites for message dissemination. The IM approach made it possible to develop a stroke campaign integrating theories, scientific evidence and information collected from the target population, and to provide clear explanations for the reasons behind key decisions during the intervention development process. NCT01881152. Retrospectively registered June 7, 2013.

  9. Using Lean Quality Improvement Tools to Increase Delivery of Evidence-Based Tobacco Use Treatment in Hospitalized Neurosurgical Patients.

    PubMed

    Sisler, Laurel; Omofoye, Oluwaseun; Paci, Karina; Hadar, Eldad; Goldstein, Adam O; Ripley-Moffitt, Carol

    2017-12-01

    Health care providers routinely undertreat tobacco dependence, indicating a need for innovative ways to increase delivery of evidence-based care. Lean, a set of quality improvement (QI) tools used increasingly in health care, can help streamline processes, create buy-in for use of evidence-based practices, and lead to the identification of solutions on the basis of a problem's root causes. To date, no published research has examined the use of Lean tools in tobacco dependence. A 12-month QI project using Lean tools was conducted to increase delivery of evidence-based tobacco use treatment (TUT) to hospitalized neurosurgical patients. The study team developed a nicotine replacement therapy (NRT) and counseling protocol for neurosurgery inpatients who indicated current tobacco use and used Lean tools to increase protocol adherence. Rates of NRT prescription, referrals to counseling, and follow-up phone calls were compared pre- and postintervention. Secondary measures included patient satisfaction with intervention, quit rates, and reduction rates at 4 weeks postdischarge. Referrals to counseling doubled from 31.7% at baseline to 62.0% after implementation of the intervention, and rates of nicotine replacement therapy (NRT) prescriptions during hospitalization and at discharge increased from 15.3% to 28.5% and 9.0% to 19.3%, respectively. Follow-up phone call rates also dramatically increased. The majority of satisfaction survey respondents indicated that counseling had a positive or neutral impact on stress level and overall satisfaction. Lean tools can dramatically increase use of evidence-based TUT in hospitalized patients. This project is easily replicable by professionals seeking to improve delivery of tobacco treatment. These findings may be particularly helpful to inpatient surgical departments that have traditionally been reticent to prescribe NRT. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  10. The Earth Gravitational Model 1996: The NCCS: Resource for Development, Resource for the Future

    NASA Technical Reports Server (NTRS)

    2002-01-01

    For centuries, men have attempted to understand the climate system through observations obtained from Earth's surface. These observations yielded preliminary understanding of the ocean currents, tides, and prevailing winds using visual observation and simple mechanical tools as their instruments. Today's sensitive, downward-looking radar systems, called altimeters, onboard satellites can measure globally the precise height of the ocean surface. This surface is largely that of the equipotential gravity surface, called the geoid - the level surface to which the oceans would conform if there were no forces acting on them apart from gravity - as well as having a significant 1-2-meter-level signal arising from the motion of the ocean's currents.

  11. DengueTools: innovative tools and strategies for the surveillance and control of dengue

    PubMed Central

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R.; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of ‘Comprehensive control of Dengue fever under changing climatic conditions’. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named ‘DengueTools’ to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of ‘DengueTools’. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools. PMID:22451836

  12. Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.

    PubMed

    Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil

    2008-09-01

    A huge amount of epidemiological and public health data is now available and requires analysis and interpretation with appropriate mathematical tools to support existing methods of controlling mosquitoes and mosquito-borne diseases more effectively; data-mining tools help make sense of this volume of data. Using data-mining tools, one can develop predictive models, patterns, association rules, and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used here for the first time to prioritize the malaria endemic regions of Manipur state, by using Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
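
    For readers unfamiliar with SOMs, the sketch below shows a standard Kohonen training loop (best-matching unit plus Gaussian neighbourhood update) in Python; the grid size and the "district" feature vectors are hypothetical, not the study's data.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, radius0=2.0, seed=0):
    """Fit a grid of prototype vectors to the data (classic Kohonen updates)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"))
    for t in range(epochs):
        frac = t / epochs
        # Learning rate and neighbourhood radius both decay over time.
        lr, radius = lr0 * (1 - frac), radius0 * (1 - frac) + 0.5
        for x in data[rng.permutation(len(data))]:
            # Best-matching unit: the node whose weights are closest to x.
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.array(np.unravel_index(np.argmin(d), (h, w)))
            # Gaussian neighbourhood pulls the BMU and nearby nodes toward x.
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=2) / (2 * radius**2))
            weights += lr * g[..., None] * (x - weights)
    return weights

# Toy "district" feature vectors (e.g. normalized case rate and rainfall).
data = np.array([[0.9, 0.8], [0.85, 0.9], [0.1, 0.2], [0.15, 0.1], [0.5, 0.5]])
som = train_som(data)
print(som.shape)  # (4, 4, 2): a 4x4 Kohonen map of prototype vectors
```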

  13. Development and relative validation of a food frequency questionnaire for French-Canadian adolescent and young adult survivors of acute lymphoblastic leukemia.

    PubMed

    Morel, Sophia; Portolese, Olivia; Chertouk, Yasmine; Leahy, Jade; Bertout, Laurence; Laverdière, Caroline; Krajinovic, Maja; Sinnett, Daniel; Levy, Emile; Marcil, Valérie

    2018-04-21

    Survivors of childhood acute lymphoblastic leukemia (cALL) experience cardiometabolic and bone complications after treatments. This study aimed at developing and validating an interviewer-administered food frequency questionnaire (FFQ) that will serve to estimate the impact of nutrition on the development of long-term sequelae in French-Canadian cALL survivors. The FFQ was developed to assess habitual diet, Mediterranean diet score, nutrients promoting bone health and antioxidants. It was validated using a 3-day food record (3-DFR) in 80 cALL survivors (50% male) aged between 11.4 and 40.1 years (median of 18.0 years). Reproducibility was evaluated by comparing FFQs from visits 1 and 2 in 29 cALL survivors. When compared to the 3-DFR, the mean values for macro- and micronutrient intake were overestimated by our FFQ, with the exception of lipid-related nutrients. Correlations between nutrient intakes derived from the FFQs and the 3-DFRs were moderate to very good (0.46-0.74). Intraclass correlation coefficients assessing FFQ reproducibility ranged from 0.62 to 0.92, indicating moderate to good reliability. Furthermore, classification into quartiles showed more than 75% of macro- and micronutrients derived from FFQs 1 and 2 classified into the same or adjacent quartile. Overall, our results support the reproducibility and accuracy of the developed FFQ to appropriately classify individuals according to their dietary intake. This validated tool will be valuable for future studies analyzing the impact of nutrition on cardiometabolic and bone complications in French-speaking populations.

  14. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objectives were to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file and a cost center dictionary, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
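
    The cross-correlation step, mapping caller/callee pairs to cost centers and aggregating them into an edge list for a graph tool, can be sketched as below; the record fields and the cost-center dictionary are hypothetical, not the actual enterprise log format.

```python
import csv
from collections import Counter
from io import StringIO

# A few hypothetical call detail records (caller, callee, duration).
cdr_csv = """caller,callee,seconds
4512,7710,25
4512,7710,31
8803,7710,240
4512,9020,12
"""

# Hypothetical extension-to-cost-center dictionary.
cost_center = {"4512": "Nursing-5W", "7710": "Pharmacy",
               "8803": "Radiology", "9020": "Lab"}

edges = Counter()
for row in csv.DictReader(StringIO(cdr_csv)):
    # Aggregate caller/callee pairs by their clinical cost centers.
    edge = (cost_center[row["caller"]], cost_center[row["callee"]])
    edges[edge] += 1

for (src, dst), n in edges.most_common():
    print(f"{src} -> {dst}: {n} calls")   # rows for a graph tool's edge file
```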

  15. Spinoff from a Moon Tool

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A portable self-contained drill capable of extracting core samples as much as 10 feet below the surface was needed for the astronauts. Black & Decker used a specially developed computer program to optimize the design of the drill's motor and ensure minimal power consumption. Refinement of the original technology led to the development of a cordless miniature vacuum cleaner called the Dustbuster. It has no hose, no cord, and is 14 inches long, and comes with a storage bracket that also serves as a recharger; it plugs into a home outlet and charges the nickel-cadmium batteries when the unit is not in use. Other home-use cordless instruments include drills, shrub trimmers and grass shears. The company also manufactures a number of cordless tools used in the sheet metal, automobile and construction industries, and a line of cordless orthopedic instruments.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pardo-Bosch, Francesc, E-mail: francesc.pardo@upc.edu; Political Science Department, University of California - Berkeley; Aguado, Antonio, E-mail: antonio.aguado@upc.edu

    Infrastructure construction, one of the biggest driving forces of the economy nowadays, requires careful analysis and clear transparency to decide which projects should be executed with the few resources available. With the aim of providing public administrations a tool with which they can make their decisions more easily, the Sustainability Index of Infrastructure Projects (SIIP) has been defined, with a multi-criteria decision system called MIVES, in order to classify non-uniform investments. This index evaluates, in two inseparable stages, the contribution of each infrastructure project to sustainable development, analyzing its social, environmental and economic impact. The result of the SIIP allows deciding the order in which projects will be prioritized. The case study developed proves the adaptability and utility of this tool for ordinary budget management.
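
    At its simplest, a weighted multi-criteria index of this kind reduces to a weighted sum of per-criterion scores. The sketch below is a toy in that spirit, with invented criteria, weights, and projects; the actual MIVES method uses calibrated value functions rather than raw scores.

```python
def sustainability_index(scores: dict, weights: dict) -> float:
    """Weighted sum of per-criterion scores, each assumed to lie in [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

weights = {"economic": 0.4, "environmental": 0.35, "social": 0.25}
projects = {
    "ring road": {"economic": 0.8, "environmental": 0.3, "social": 0.6},
    "tram line": {"economic": 0.6, "environmental": 0.8, "social": 0.7},
}
# Rank projects by index value, highest (most sustainable) first.
ranked = sorted(projects,
                key=lambda p: sustainability_index(projects[p], weights),
                reverse=True)
print(ranked)  # ['tram line', 'ring road'] -> funding priority order
```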

  17. CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.

    PubMed

    Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G

    2010-11-13

    Using Semantic-Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either not compatible with the powerful reasoning tools developed for the Semantic Web, or designed only for structured clinical data and therefore not ready to be applied directly to natural-language-based clinical narrative reports. We have developed a Semantic-Web ontology called the Clinical Narrative Temporal Relation Ontology (CNTRO). Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. More temporal information and relations can then be inferred by Semantic-Web based reasoning tools. Experimental results show that this ontology can represent temporal information in real clinical narratives successfully.
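
    Representing a temporal assertion from a narrative as RDF triples can be sketched with Python's rdflib as below; the namespace and property names are placeholders, not the actual CNTRO vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# Placeholder namespace; the real CNTRO vocabulary differs.
EX = Namespace("http://example.org/cntro-sketch#")
g = Graph()
g.bind("ex", EX)

onset = EX["chestPainOnset"]
admit = EX["hospitalAdmission"]
g.add((onset, RDF.type, EX.ClinicalEvent))
g.add((onset, EX.hasEventTime,
       Literal("2010-03-05T14:00:00", datatype=XSD.dateTime)))
g.add((admit, RDF.type, EX.ClinicalEvent))
g.add((admit, EX.after, onset))  # "admitted after symptom onset"

print(g.serialize(format="turtle"))  # triples a reasoner could extend
```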

  18. Interactive intelligent remote operations: application to space robotics

    NASA Astrophysics Data System (ADS)

    Dupuis, Erick; Gillett, G. R.; Boulanger, Pierre; Edwards, Eric; Lipsett, Michael G.

    1999-11-01

    A set of tools addressing the problems specific to the control and monitoring of remote robotic systems from extreme distances has been developed. The tools include the capability to model and visualize the remote environment, to generate and edit complex task scripts, to execute the scripts in supervisory control mode, and to monitor and diagnose equipment from multiple remote locations. Two prototype systems were implemented for demonstration. The first demonstration, using a prototype joint design called Dexter, shows the applicability of the approach to space robotic operation in low Earth orbit. The second demonstration uses a remotely controlled excavator in an operational open-pit tar sand mine. This demonstrates that the tools developed can also be used for planetary exploration operations as well as for terrestrial mining applications.

  19. Modeling crime events by d-separation method

    NASA Astrophysics Data System (ADS)

    Aarthee, R.; Ezhilmaran, D.

    2017-11-01

    Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we propose a d-separation method for modeling crime events. A d-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for the development of a software support tool that can deal with evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details. This will hopefully improve the communication between judges or jurors and experts. The proposed method uncovers more valid independencies than any other graphical criterion.
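
    The criterion itself is mechanical: X and Y are d-separated by Z exactly when Z blocks every path between them in the moralized ancestral graph. The Python sketch below implements that classic construction over a hypothetical crime-scenario DAG.

```python
from collections import deque

def ancestors(dag, nodes):
    """All nodes with a directed path into `nodes`, plus the nodes themselves."""
    parents = {v: set() for v in dag}
    for u, vs in dag.items():
        for v in vs:
            parents[v].add(u)
    seen, queue = set(nodes), deque(nodes)
    while queue:
        for p in parents[queue.popleft()]:
            if p not in seen:
                seen.add(p)
                queue.append(p)
    return seen

def d_separated(dag, x, y, z):
    keep = ancestors(dag, {x, y} | z)
    # Moralize: undirect retained arcs and marry co-parents of each node.
    adj = {v: set() for v in keep}
    for u in keep:
        for v in dag[u]:
            if v in keep:
                adj[u].add(v); adj[v].add(u)
    for v in keep:
        ps = [u for u in keep if v in dag[u]]
        for a in ps:
            for b in ps:
                if a != b:
                    adj[a].add(b)
    # d-separated iff no path from x to y avoids the conditioning set z.
    seen, queue = {x} | z, deque([x])
    while queue:
        for n in adj[queue.popleft()]:
            if n == y:
                return False
            if n not in seen:
                seen.add(n); queue.append(n)
    return True

# Hypothetical DAG: Motive -> Crime <- Opportunity; Crime -> Evidence
dag = {"M": {"C"}, "O": {"C"}, "C": {"E"}, "E": set()}
print(d_separated(dag, "M", "O", set()))   # True: collider C unobserved
print(d_separated(dag, "M", "O", {"E"}))   # False: conditioning opens collider
```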

  20. Efficient prediction of human protein-protein interactions at a global scale.

    PubMed

    Schoenrock, Andrew; Samanfar, Bahram; Pitre, Sylvain; Hooshyar, Mohsen; Jin, Ke; Phillips, Charles A; Wang, Hui; Phanse, Sadhna; Omidi, Katayoun; Gui, Yuan; Alamgir, Md; Wong, Alex; Barrenäs, Fredrik; Babu, Mohan; Benson, Mikael; Langston, Michael A; Green, James R; Dehne, Frank; Golshani, Ashkan

    2014-12-10

    Our knowledge of global protein-protein interaction (PPI) networks in complex organisms such as humans is hindered by technical limitations of current methods. On the basis of short co-occurring polypeptide regions, we developed a tool called MP-PIPE capable of predicting a global human PPI network within 3 months. With a recall of 23% at a precision of 82.1%, we predicted 172,132 putative PPIs. We demonstrate the usefulness of these predictions through a range of experiments. The speed and accuracy associated with MP-PIPE can make this a potential tool to study individual human PPI networks (from genomic sequences alone) for personalized medicine.

  1. Development of a prototype commonality analysis tool for use in space programs

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1988-01-01

    A software tool to aid in performing commonality analyses, called the Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. CAPS is designed around a simple input language which provides a natural syntax for the description of feasibility constraints. It provides its users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and do a comprehensive cost analysis to find the most economical substitution pattern.
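
    A toy sketch of the kind of analysis described, written in Python rather than the CAPS input language (whose actual syntax is not reproduced here); the items, costs, and constraint are invented for illustration:

        # Toy commonality analysis: for each design item, substitute the cheapest
        # feasible alternative (illustrative only; not the CAPS 1.0 algorithm).
        items = {
            "valve_A": {"cost": 120, "pressure_rating": 300},
            "valve_B": {"cost": 90,  "pressure_rating": 350},
            "valve_C": {"cost": 70,  "pressure_rating": 200},
        }

        def feasible(candidate, required):
            # Feasibility constraint: the substitute must meet the required rating.
            return candidate["pressure_rating"] >= required["pressure_rating"]

        def cheapest_substitute(name):
            required = items[name]
            options = [(spec["cost"], other) for other, spec in items.items()
                       if feasible(spec, required)]
            return min(options)[1]

        for name in items:
            print(name, "->", cheapest_substitute(name))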

  2. Tool Measures Depths of Defects on a Case Tang Joint

    NASA Technical Reports Server (NTRS)

    Ream, M. Bryan; Montgomery, Ronald B.; Mecham, Brent A.; Keirstead, Bums W.

    2005-01-01

    A special-purpose tool has been developed for measuring the depths of defects on an O-ring seal surface. The surface lies in a specially shaped ringlike fitting, called a capture feature tang, located on an end of a cylindrical segment of a case that contains a solid-fuel booster rocket motor for launching a space shuttle. The capture feature tang is a part of a tang-and-clevis, O-ring joint between the case segment and a similar, adjacent cylindrical case segment. When the segments are joined, the tang makes an interference fit with the clevis and squeezes the O-ring at the side of the gap.

  3. Scientific crowdsourcing in wildlife research and conservation: Tigers (Panthera tigris) as a case study.

    PubMed

    Can, Özgün Emre; D'Cruze, Neil; Balaskas, Margaret; Macdonald, David W

    2017-03-01

    With around 3,200 tigers (Panthera tigris) left in the wild, the governments of 13 tiger range countries recently declared that there is a need for innovation to aid tiger research and conservation. In response to this call, we created the "Think for Tigers" study to explore whether crowdsourcing has the potential to innovate the way researchers and practitioners monitor tigers in the wild. The study demonstrated that the benefits of crowdsourcing are not restricted to harnessing time, labor, and funds from the public; crowdsourcing can also be used as a tool to harness creative thinking that contributes to the development of new research tools and approaches. Based on our experience, we make practical recommendations for designing a crowdsourcing initiative as a tool for generating ideas.

  4. Soldering Tool for Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Takahashi, Ted H.

    1987-01-01

    Many connections soldered simultaneously in confined spaces. Improved soldering tool bonds integrated circuits onto printed-circuit boards. Intended especially for use with so-called "leadless-carrier" integrated circuits.

  5. An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.

    PubMed

    Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida

    2017-05-31

    Recently copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Whole exome sequencing (WES) has become the primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, the complexity of tumors makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection from WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on tools that have been designed, or have the ability, to detect cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real and simulated data. Comparative analysis of the results showed that there is low consensus among the tools in calling CNVs. Using real data, the tools show moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Using simulated data, we observed that increasing the coverage beyond 10× in exonic regions does not significantly improve the detection power of the tools. The limited performance of current CNV detection tools for WES data in cancer indicates the need for more efficient and precise CNV detection methods. Due to the complexity of tumors and the high level of noise and bias in WES data, advanced segmentation, normalization and de-noising techniques designed specifically for cancer data are necessary. CNV detection development also suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly interfaces and visualization features could enhance CNV studies for a broader range of users.
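
    For reference, the two headline metrics used in the comparison follow directly from the counts of true and false calls; a generic sketch (not the study's evaluation code):

        # Sensitivity (recall) and false discovery rate from call counts
        # (generic definitions, not the study's scripts).
        def sensitivity(tp, fn):
            return tp / (tp + fn)          # fraction of true CNVs that were called

        def false_discovery_rate(tp, fp):
            return fp / (tp + fp)          # fraction of calls that are wrong

        # Example: 80 true CNVs called, 40 missed, 60 spurious calls.
        print(f"sensitivity = {sensitivity(80, 40):.2f}")   # 0.67
        print(f"FDR = {false_discovery_rate(80, 60):.2f}")  # 0.43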

  6. Validation of a Pressure-Based Combustion Simulation Tool Using a Single Element Injector Test Problem

    NASA Technical Reports Server (NTRS)

    Thakur, Siddarth; Wright, Jeffrey

    2006-01-01

    The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next-generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM, is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci, which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study involving a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. A refined grid, which is clustered in the vicinity of the solid walls as well as the flame, is used to obtain a steady-state solution which may be considered the best solution attainable with the steady-state RANS methodology. From a design point of view, quick turnaround times are desirable; with this in mind, coarser grids are also employed and the resulting solutions are evaluated with respect to the fine-grid solution.

  7. Development, features and application of DIET ASSESS & PLAN (DAP) software in supporting public health nutrition research in Central Eastern European Countries (CEEC).

    PubMed

    Gurinović, Mirjana; Milešević, Jelena; Kadvan, Agnes; Nikolić, Marina; Zeković, Milica; Djekić-Ivanković, Marija; Dupouy, Eleonora; Finglas, Paul; Glibetić, Maria

    2018-01-01

    In order to meet growing public health nutrition challenges in Central Eastern European Countries (CEEC) and Balkan countries, development of a Research Infrastructure (RI) and availability of an effective nutrition surveillance system are a prerequisite. The building block of this RI is an innovative tool called DIET ASSESS & PLAN (DAP), which is a platform for standardized and harmonized food consumption collection, comprehensive dietary intake assessment and nutrition planning. Its unique structure enables application of national food composition databases (FCDBs) from the European food composition exchange platform (28 national FCDBs) developed by EuroFIR (http://www.eurofir.org/) and in addition allows communication with other tools. DAP is used for daily menu and/or long-term diet planning in diverse public sector settings, food design/reformulation, food labelling, nutrient intake assessment and calculation of the dietary diversity indicator, Minimum Dietary Diversity-Women (MDD-W). As a validated tool in different national and international projects, DAP represents an important RI in public health nutrition epidemiology in the CEEC region.

  8. Transputer parallel processing at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1989-01-01

    The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.

  9. A call center primer.

    PubMed

    Durr, W

    1998-01-01

    Call centers are strategically and tactically important to many industries, including the healthcare industry. Call centers play a key role in acquiring and retaining customers. The ability to deliver high-quality and timely customer service without much expense is the basis for the proliferation and expansion of call centers. Call centers are unique blends of people and technology, where performance depends on combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, the people working in call centers and the skill of the management team ultimately make a difference to their companies.

  10. Ocean Drilling Program: TAMRF Administrative Services: Meeting, Travel, and

    Science.gov Websites

    ODP Meeting, Travel, and Port-Call Information (ODP/TAMU Science Operator website): all ODP meeting and port-call activities are complete.

  11. Marketing, Management and Performance: Multilingualism as Commodity in a Tourism Call Centre

    ERIC Educational Resources Information Center

    Duchene, Alexandre

    2009-01-01

    This paper focuses on the ways an institution of the new economy--a tourism call centre in Switzerland--markets, manages and performs multilingual services. In particular, it explores the ways multilingualism operates as a strategic and managerial tool within tourism call centres and how the institutional regulation of language practices…

  12. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures.

  13. The BioCyc collection of microbial genomes and metabolic pathways.

    PubMed

    Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi

    2017-08-17

    BioCyc.org is a microbial genome Web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services and analysis software. Recent advances in BioCyc include an expansion in the content of BioCyc in terms of both the number of genomes and the types of information available for each genome; an expansion in the amount of curated content within BioCyc; and new developments in the BioCyc software tools including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance.

  14. "Plasmo2D": an ancillary proteomic tool to aid identification of proteins from Plasmodium falciparum.

    PubMed

    Khachane, Amit; Kumar, Ranjit; Jain, Sanyam; Jain, Samta; Banumathy, Gowrishankar; Singh, Varsha; Nagpal, Saurabh; Tatu, Utpal

    2005-01-01

    Bioinformatics tools to aid gene and protein sequence analysis have become an integral part of biology in the post-genomic era. Release of the Plasmodium falciparum genome sequence has allowed biologists to define the gene and the predicted protein content as well as their sequences in the parasite. Using pI and molecular weight as characteristics unique to each protein, we have developed a bioinformatics tool to aid identification of proteins from Plasmodium falciparum. The tool makes use of a Virtual 2-DE generated by plotting all of the proteins from the Plasmodium database on a pI versus molecular weight scale. Proteins are identified by comparing the position of migration of desired protein spots from an experimental 2-DE and that on a virtual 2-DE. The procedure has been automated in the form of user-friendly software called "Plasmo2D". The tool can be downloaded from http://144.16.89.25/Plasmo2D.zip.
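
    The virtual 2-DE idea sketches easily: compute a predicted pI and molecular weight for each sequence and plot them on the same axes as the experimental gel. A minimal sketch with Biopython and matplotlib, using placeholder sequences rather than actual P. falciparum proteins:

        # Sketch of a "virtual 2-DE": predicted pI vs. molecular weight for a set
        # of protein sequences (placeholder sequences, not parasite proteins).
        from Bio.SeqUtils.ProtParam import ProteinAnalysis
        import matplotlib.pyplot as plt

        proteins = {
            "spot_1": "MKWVTFISLLLLFSSAYSRGV",
            "spot_2": "MALWMRLLPLLALLALWGPDPA",
        }

        for name, seq in proteins.items():
            pa = ProteinAnalysis(seq)
            pi, mw = pa.isoelectric_point(), pa.molecular_weight()
            plt.scatter(pi, mw)
            plt.annotate(name, (pi, mw))

        plt.xlabel("pI")
        plt.ylabel("molecular weight (Da)")
        plt.savefig("virtual_2de.png")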

  15. REDO: RNA Editing Detection in Plant Organelles Based on Variant Calling Results.

    PubMed

    Wu, Shuangyang; Liu, Wanfei; Aljohi, Hasan Awad; Alromaih, Sarah A; Alanazi, Ibrahim O; Lin, Qiang; Yu, Jun; Hu, Songnian

    2018-05-01

    RNA editing is a post-transcriptional or cotranscriptional process that changes the sequence of the precursor transcript by substitutions, insertions, or deletions. Almost all land plants undergo RNA editing in organelles (plastids and mitochondria). Although several software tools have been developed to identify RNA editing events, distinguishing true RNA editing events from genome variation, sequencing errors, and other factors remains a great challenge. Here we introduce REDO, a comprehensive application tool for identifying RNA editing events in plant organelles based on variant call format (VCF) files from RNA-sequencing data. REDO is a suite of Perl scripts that illustrate a range of attributes of RNA editing events in figures and tables. REDO can also detect RNA editing events in multiple samples simultaneously and identify significantly differential proportions of RNA editing loci. Compared with similar tools such as REDItools, REDO runs faster with higher accuracy and more specificity, at the cost of slightly lower sensitivity. Moreover, REDO annotates each RNA editing site in RNAs, whereas REDItools reports only possible RNA editing sites in the genome, which need additional steps to obtain RNA editing profiles for RNAs. Overall, REDO can identify potential RNA editing sites easily and provides several functions such as detailed annotations, statistics, figures, and significantly differential proportions of RNA editing sites among different samples.
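
    A much-simplified sketch of the core idea: scan a VCF for C-to-T (and reverse-strand G-to-A) substitutions, which is how the C-to-U editing that dominates in plant organelles appears in variant calls; REDO's actual pipeline adds filtering, annotation, statistics, and multi-sample comparison:

        # Minimal sketch: flag candidate C-to-U RNA editing sites in a VCF
        # (plant organellar editing is predominantly C-to-U, seen as C->T on the
        # forward strand or G->A on the reverse strand). File name is a placeholder.
        def candidate_editing_sites(vcf_path):
            with open(vcf_path) as fh:
                for line in fh:
                    if line.startswith("#"):
                        continue                     # skip header lines
                    chrom, pos, _id, ref, alt = line.split("\t")[:5]
                    if (ref, alt) in {("C", "T"), ("G", "A")}:
                        yield chrom, int(pos)

        for site in candidate_editing_sites("organelle_variants.vcf"):
            print(*site)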

  16. Thinking about Pregnancy After Premature Birth

    MedlinePlus

    ... a kind of fertility treatment called assisted reproductive technology (also called ART). Fertility treatment is medical treatment ...

  17. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  18. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    NASA Technical Reports Server (NTRS)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. NCP was written following the Object-Oriented Paradigm (C++, CORBA), and the software development process used was also based on the object-oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, this software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  19. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
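
    The abstract does not spell out CCC's rules, so the sketch below is purely illustrative of rule-based command-sequence checking: hypothetical timing and ordering constraints applied to a command list (none of these rules, names, or values come from CCC):

        # Illustrative constraint check on a spacecraft command sequence
        # (hypothetical rules; not the actual CCC rule base).
        commands = [  # (time in seconds, command name)
            (0.0, "HEATER_ON"),
            (0.5, "CAMERA_POWER_ON"),
            (0.8, "CAMERA_CAPTURE"),
        ]

        MIN_SPACING = 1.0  # assumed minimum time between consecutive commands
        REQUIRES = {"CAMERA_CAPTURE": "CAMERA_POWER_ON"}  # ordering constraints

        def check(sequence):
            issued, prev_time = set(), None
            for t, cmd in sequence:
                if prev_time is not None and t - prev_time < MIN_SPACING:
                    print(f"timing violation: {cmd} at t={t}")
                if cmd in REQUIRES and REQUIRES[cmd] not in issued:
                    print(f"sequencing violation: {cmd} before {REQUIRES[cmd]}")
                issued.add(cmd)
                prev_time = t

        check(commands)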

  20. Nutrition environment measures survey-vending: development, dissemination, and reliability.

    PubMed

    Voss, Carol; Klein, Susan; Glanz, Karen; Clawson, Margaret

    2012-07-01

    Researchers determined a need to develop an instrument to assess the vending machine environment that was comparable in reliability and validity to other Nutrition Environment Measures Survey tools and that would provide consistent and comparable data for businesses, schools, and communities. Tool development, reliability testing, and dissemination of the Nutrition Environment Measures Survey-Vending (NEMS-V) involved a collaboration of students, professionals, and community leaders. Interrater reliability testing showed high levels of agreement among trained raters on the products and evaluations of products. NEMS-V can benefit public health partners implementing policy and environmental change initiatives as a part of their community wellness activities. The vending machine project will support a policy calling for state facilities to provide a minimum of 30% of foods and beverages in vending machines as healthy options, based on NEMS-V criteria, which will be used as a model for other businesses.

  1. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  2. THE HUMAN BEHAVIOR RATING SCALE-BRIEF: A TOOL TO MEASURE 21ST CENTURY SKILLS OF K-12 LEARNERS.

    PubMed

    Woods-Groves, Suzanne

    2015-06-01

    Currently there is a call for brief, concise measurements to appraise relevant 21st century college readiness skills in K-12 learners. This study employed K-12 teachers' ratings of over 3,000 students on an existing 91-item rating scale, the Human Behavior Rating Scale, which measured the 21st century skills of persistence, curiosity, externalizing affect, internalizing affect, and cognition. Teachers' ratings of K-12 learners were used to develop a brief, concise, and manageable 30-item tool, the Human Behavior Rating Scale-Brief. Results yielded high internal consistency coefficients and inter-item correlations. The items were not biased with regard to student sex or race, and were supported through confirmatory factor analyses. In addition, when teachers' ratings were compared with students' academic and behavioral performance data, moderate to strong relationships were revealed. This study provided an essential first step in the development of a psychometrically sound, manageable, and brief tool to appraise 21st century skills in K-12 learners.

  3. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    NASA Astrophysics Data System (ADS)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, in a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.

  4. A flexible tool for diagnosing water, energy, and entropy budgets in climate models

    NASA Astrophysics Data System (ADS)

    Lembo, Valerio; Lucarini, Valerio

    2017-04-01

    We have developed a new, flexible software package for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent, and sensible energy fluxes, with the requirement that the variable names agree with the Climate and Forecast (CF) conventions for NetCDF datasets. Annual mean maps, meridional sections, and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
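
    The CDO steps described above are command-line operators and can be scripted; a sketch driving them from Python with placeholder file names (yearmean, zonmean, and fldmean are standard CDO operators for annual means, zonal-mean sections, and area-mean time series):

        # Sketch of the kind of CDO processing described above, run via subprocess.
        import subprocess

        def cdo(*args):
            subprocess.run(["cdo", *args], check=True)

        # Annual-mean map, zonal-mean meridional section, and global-mean time
        # series of a net energy flux stored in a CF-compliant NetCDF file.
        cdo("yearmean", "net_flux.nc", "net_flux_annmean.nc")
        cdo("zonmean", "net_flux.nc", "net_flux_zonmean.nc")
        cdo("fldmean", "net_flux.nc", "net_flux_timeseries.nc")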

  5. OCSEGen: Open Components and Systems Environment Generator

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana

    2014-01-01

    To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called the environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.

  6. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  7. Semantic Importance Sampling for Statistical Model Checking

    DTIC Science & Technology

    2015-01-16

    ... SMT calls while maintaining correctness. Finally, we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems. ... Section 2 surveys related work. Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis. ... We do this by first randomly selecting a cube c from C* with uniform probability, since each cube has equal probability ...

  8. Comparative Analysis of CNV Calling Algorithms: Literature Survey and a Case Study Using Bovine High-Density SNP Data.

    PubMed

    Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E

    2013-06-25

    Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
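
    The "large incongruities" the review reports can be quantified with a simple concordance measure over the call sets produced by different tools; a generic sketch (not the authors' code), treating each call as a (chromosome, start, end, type) tuple:

        # Generic concordance check between CNV call sets (Jaccard index on
        # exact-match calls; real comparisons often use reciprocal overlap).
        def jaccard(calls_a, calls_b):
            a, b = set(calls_a), set(calls_b)
            return len(a & b) / len(a | b) if a | b else 1.0

        tool_1 = {("chr1", 100000, 250000, "gain"), ("chr2", 500000, 650000, "loss")}
        tool_2 = {("chr1", 100000, 250000, "gain"), ("chr5", 10000, 90000, "gain")}

        print(f"concordance = {jaccard(tool_1, tool_2):.2f}")  # 0.33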

  9. Mobile phone call data as a regional socio-economic proxy indicator.

    PubMed

    Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K; Ylä-Jääski, Antti

    2015-01-01

    The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models for mobility patterns, could be also used for creating new socio-economic proxy indicators that will not rely only on the local or state institutions. In this paper, from the frequency of calls at different times of the day, in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate to the data from other sources, such as: news for particular events in the given period, census data, economic activity, poverty index, power plants and energy grid data. Our results show high correlation in many of the cases revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's.
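
    A hedged sketch of the home/work inference step described above, using pandas: home is taken as the modal night-time call region and work as the modal working-hours region (the hour windows and data are illustrative; the paper's exact rules may differ):

        # Heuristic home/work inference from call records (illustrative only).
        import pandas as pd

        calls = pd.DataFrame({
            "user": ["u1"] * 6,
            "hour": [23, 1, 2, 10, 11, 14],
            "subprefecture": ["A", "A", "A", "B", "B", "B"],
        })

        night = calls[calls["hour"].isin([22, 23, 0, 1, 2, 3, 4, 5])]
        day = calls[calls["hour"].between(9, 17)]

        home = night.groupby("user")["subprefecture"].agg(lambda s: s.mode()[0])
        work = day.groupby("user")["subprefecture"].agg(lambda s: s.mode()[0])
        print(home["u1"], work["u1"])  # A B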

  10. The theory of interface slicing

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a new tool which was developed to facilitate reuse-based software engineering, by addressing the following problems, needs, and issues: (1) size of systems incorporating reused modules; (2) knowledge requirements for program modification; (3) program understanding for reverse engineering; (4) module granularity and domain management; and (5) time and space complexity of conventional slicing. The definition of a form of static program analysis called interface slicing is addressed.

  11. Impact of Gender, Ethnicity, Year in School, Social Economic Status, and State Standardized Assessment Scores on Student Content Knowledge Achievement when Using Vee Maps as a Formative Assessment Tool

    ERIC Educational Resources Information Center

    Thoron, Andrew C.; Myers, Brian E.

    2011-01-01

    The National Research Council has recognized the challenge of assessing laboratory investigation and called for the investigation of assessments that are proven through sound research-based studies. The Vee map provides a framework that allows the learners to conceptualize their previous knowledge as they develop success in meaningful learning…

  12. Digitizing for Computer-Aided Finite Element Model Generation.

    DTIC Science & Technology

    1979-10-10

    This approach is a collection of programs developed over the last eight years at the University of Arizona, called the GIFTS system. This paper briefly describes the latest version of the system, GIFTS-5, and demonstrates its suitability in a design environment by simple examples. The programs constituting the GIFTS system were used as a tool for research in many areas, including mesh generation, finite element data base design, and interactive ...

  13. Stress management standards: a warning indicator for employee health.

    PubMed

    Kazi, A; Haslam, C O

    2013-07-01

    Psychological stress is a major cause of lost working days in the UK. The Health & Safety Executive (HSE) has developed management standards (MS) to help organizations to assess work-related stress. To investigate the relationships between the MS indicator tool and employee health, job attitudes, work performance and environmental outcomes. The first phase involved a survey employing the MS indicator tool, General Health Questionnaire-12 (GHQ-12), job attitudes, work performance and environmental measures in a call centre from a large utility company. The second phase comprised six focus groups to investigate what employees believed contributed to their perceived stress. Three hundred and four call centre employees responded, a response rate of 85%. Significant negative correlations were found between GHQ-12 and two MS dimensions: demands (Rho = -0.211, P < 0.001) and relationships (Rho = -0.134, P < 0.05). Other dimensions showed no significant relationship with GHQ-12. Higher levels of stress were associated with reduced job performance, job motivation and increased intention to quit, but low stress levels were associated with reduced job satisfaction. Lack of management support, recognition and development opportunities were identified as sources of stress. The findings support the utility of the MS as a measure of employee attitudes and performance.

  14. Developing a Web-Based Ppgis, as AN Environmental Reporting Service

    NASA Astrophysics Data System (ADS)

    Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.

    2017-09-01

    Today municipalities are searching for new tools to empower locals to shape the future of their own areas by increasing their participation at different levels of urban planning. These tools should involve the community in the planning process using participatory approaches instead of traditional top-down planning models, and should help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge about problems of the city in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), has been designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem is developed based on SOLAP (Spatial On-Line Analytical Processing), as a data-mining tool to elicit the local knowledge, facilitating bottom-up urban planning practices and helping urban managers find hidden relations among the recorded reports. The system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects residents' problems and views about their neighborhood and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop plans that satisfy the citizens of those neighborhoods.

  15. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

    The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  16. Development and Exploration of a Regional Stormwater BMP Performance Database to Parameterize an Integrated Decision Support Tool (i-DST)

    NASA Astrophysics Data System (ADS)

    Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.

    2017-12-01

    Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.

  17. Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.

    PubMed

    Bae, Jong-Myon; Kim, Eun Hee

    2016-03-01

    The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
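
    Both PubMed citation discovery tools used by the AMA are exposed programmatically through NCBI's E-utilities elink endpoint ('pubmed_pubmed_citedin' for "cited by", 'pubmed_pubmed' for "similar articles"); a sketch with the requests library and a placeholder PMID:

        # Fetch "cited by" and "similar articles" PMID lists via NCBI E-utilities.
        import requests

        EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

        def linked_pmids(pmid, linkname):
            params = {"dbfrom": "pubmed", "db": "pubmed",
                      "linkname": linkname, "id": pmid, "retmode": "json"}
            data = requests.get(EUTILS, params=params).json()
            linksetdbs = data["linksets"][0].get("linksetdbs", [{}])
            return linksetdbs[0].get("links", [])

        pmid = "12345678"  # placeholder PMID
        cited_by = linked_pmids(pmid, "pubmed_pubmed_citedin")
        similar = linked_pmids(pmid, "pubmed_pubmed")
        print(len(cited_by), "citing articles;", len(similar), "similar articles")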

  18. Deformable complex network for refining low-resolution X-ray structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chong; Wang, Qinghua; Ma, Jianpeng, E-mail: jpma@bcm.edu

    2015-10-27

    In macromolecular X-ray crystallography, building more accurate atomic models based on lower resolution experimental diffraction data remains a great challenge. Previous studies have used a deformable elastic network (DEN) model to aid in low-resolution structural refinement. In this study, the development of a new refinement algorithm called the deformable complex network (DCN) is reported that combines a novel angular network-based restraint with the DEN model in the target function. Testing of DCN on a wide range of low-resolution structures demonstrated that it consistently leads to significantly improved structural models as judged by multiple refinement criteria, thus representing a new effective refinement tool for low-resolution structure determination.

  19. Wet Lab Accelerator: A Web-Based Application Democratizing Laboratory Automation for Synthetic Biology.

    PubMed

    Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S

    2017-01-20

    Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag and drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
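
    As a purely hypothetical illustration of what such an abstraction layer does, the sketch below expands one high-level "transfer" step into a list of low-level instruction dictionaries; the field names and op are invented for illustration and are not the real Omniprotocol or Autoprotocol schemas:

        # Hypothetical abstraction layer: expand a graphical "transfer" step into
        # per-well instructions (invented schema, not Omniprotocol/Autoprotocol).
        def expand_transfer(step):
            return [
                {"op": "pipette",
                 "from": step["source"],
                 "to": f'{step["dest_plate"]}/{well}',
                 "volume": step["volume"]}
                for well in step["dest_wells"]
            ]

        step = {"source": "reagents/buffer", "dest_plate": "plate_1",
                "dest_wells": ["A1", "A2"], "volume": "50:microliter"}
        for instruction in expand_transfer(step):
            print(instruction)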

  20. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features, and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
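
    Since Sphinx configuration is itself Python, a minimal conf.py of the kind described might look as follows (project name and layout are placeholders, not Amanzi's actual configuration; 'matplotlib.sphinxext.plot_directive' is matplotlib's standard Sphinx extension for executed, embedded plots):

        # Minimal Sphinx conf.py: reStructuredText sources with rendered
        # equations and inline matplotlib plots (placeholder project settings).
        project = "amanzi-docs-example"
        extensions = [
            "sphinx.ext.mathjax",                   # equation rendering
            "matplotlib.sphinxext.plot_directive",  # execute and embed plots
        ]
        source_suffix = ".rst"
        master_doc = "index"

        # Build html and pdf from the same source tree, e.g.:
        #   sphinx-build -b html  docs/ build/html
        #   sphinx-build -b latex docs/ build/latex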

  1. A decision analysis tool for the assessment of posterior fossa tumour surgery outcomes in children--the "Liverpool Neurosurgical Complication Causality Assessment Tool".

    PubMed

    Zakaria, Rasheed; Ellenbogen, Jonathan; Graham, Catherine; Pizer, Barry; Mallucci, Conor; Kumar, Ram

    2013-08-01

    Complications may occur following posterior fossa tumour surgery in children. Such complications are subjectively and inconsistently reported even though they may have significant long-term behavioural and cognitive consequences for the child. This makes comparison of surgeons, programmes and treatments problematic. We have devised a causality tool for assessing if an adverse event after surgery can be classified as a surgical complication using a series of simple questions, based on a tool used in assessing adverse drug reactions. This tool, which we have called the "Liverpool Neurosurgical Complication Causality Assessment Tool", was developed by reviewing a series of ten posterior fossa tumour cases with a panel of neurosurgery, neurology, oncology and neuropsychology specialists working in a multidisciplinary paediatric tumour treatment programme. We have demonstrated its use and hope that it may improve reliability between different assessors both in evaluating the outcomes of existing programmes and treatments as well as aiding in trials which may directly compare the effects of surgical and medical treatments.

  2. VCFR: A package to manipulate and visualize variant call format data in R

    USDA-ARS?s Scientific Manuscript database

    Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as their output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software exists to read vcf data, many of them only ex...

  3. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give users what they want, and produce defects; one estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems require larger and more complex software for support, and as this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity: it has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  4. A survey on annotation tools for the biomedical literature.

    PubMed

    Neves, Mariana; Leser, Ulf

    2014-03-01

    New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and for better understanding the information sought by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts, and should generate an easy-to-parse output format. Today, a range of tools implementing some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experiences whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a true comprehensive solution.

  5. Use of ecological momentary assessment to determine which structural factors impact perceived teaching quality of attending rounds.

    PubMed

    Willett, Lisa; Houston, Thomas K; Heudebert, Gustavo R; Estrada, Carlos

    2012-09-01

    Providing high-quality teaching to residents during attending rounds is challenging. Reasons include structural factors that affect rounds, which are beyond the attending's teaching style and control. To identify the structural components of ward rounds that most affect teaching quality in an internal medicine (IM) residency program, the authors developed a 10-item Ecological Momentary Assessment (EMA) tool and collected daily evaluations for 18 months from IM residents rotating on inpatient services. Residents ranked the quality of teaching on rounds that day and answered questions related to their service (general medicine, medical intensive care unit, and subspecialty services), patient census, absenteeism of team members, call status, and the number of teaching methods used by the attending. Residents completed 488 evaluation cards over 18 months. The analysis found no association between perceived teaching quality and training level, team absenteeism, or call status. We observed differences by service (P < .001) and patient census (P = .009). After adjusting for type of service, census was no longer significant. Use of a larger variety of teaching methods was associated with higher perceived teaching quality, regardless of service or census (P for trend < .001). The EMA tool successfully identified that higher patient census was associated with lower perceived teaching quality, but the results were also influenced by the type of teaching service. We found that, regardless of census or teaching service, attendings can improve their teaching by diversifying the number of methods used in daily rounds.

  6. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  7. Electrophysiological signal analysis and visualization using Cloudwave for epilepsy clinical research.

    PubMed

    Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S

    2013-01-01

    Epilepsy is the most common serious neurological disorder, affecting 50-60 million persons worldwide. Electrophysiological data recordings, such as the electroencephalogram (EEG), are the gold standard for diagnosis and pre-surgical evaluation in epilepsy patients. The increasing trend towards multi-center clinical studies requires signal visualization and analysis tools that support real-time interaction with signal data in a collaborative environment, which cannot be supported by traditional desktop-based standalone applications. As part of the Prevention and Risk Identification of SUDEP Mortality (PRISM) project, we have developed a Web-based electrophysiology data visualization and analysis platform called Cloudwave using highly scalable open-source cloud computing infrastructure. Cloudwave is integrated with the PRISM patient cohort identification tool called MEDCIS (Multi-modality Epilepsy Data Capture and Integration System). The Epilepsy and Seizure Ontology (EpSO) underpins both Cloudwave and MEDCIS to support query composition and result retrieval. Cloudwave is being used by clinicians and research staff at the University Hospitals Case Medical Center (UH-CMC) Epilepsy Monitoring Unit (EMU) and will be progressively deployed at four EMUs in the United States and the United Kingdom as part of the PRISM project.

  8. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded by rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access, an innovative approach to this problem based on standard off-the-shelf techniques. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. The system is implemented in an accredited, deployable form and incorporates a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.

  9. Extreme Weather Events and Interconnected Infrastructures: Toward More Comprehensive Climate Change Planning [Meeting challenges in understanding impacts of extreme weather events on connected infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilbanks, Thomas J.; Fernandez, Steven J.; Allen, Melissa R.

    The President's Climate Change Action Plan calls for the development of better science, data, and tools for climate preparedness. Many of the current questions about preparedness for extreme weather events in coming decades are, however, difficult to answer with assets that have been developed by climate science to answer longer-term questions about climate change. Capacities for projecting exposures to climate-related extreme events, along with their implications for interconnected infrastructures, are now emerging.

  10. Extreme Weather Events and Interconnected Infrastructures: Toward More Comprehensive Climate Change Planning [Meeting challenges in understanding impacts of extreme weather events on connected infrastructures

    DOE PAGES

    Wilbanks, Thomas J.; Fernandez, Steven J.; Allen, Melissa R.

    2015-06-23

    The President's Climate Change Action Plan calls for the development of better science, data, and tools for climate preparedness. Many of the current questions about preparedness for extreme weather events in coming decades are, however, difficult to answer with assets that have been developed by climate science to answer longer-term questions about climate change. Capacities for projecting exposures to climate-related extreme events, along with their implications for interconnected infrastructures, are now emerging.

  11. Recent developments in the design and verification of crystalline polarization scramblers for space applications

    NASA Astrophysics Data System (ADS)

    Dubroca, Guilhem; Richert, Michaël.; Loiseaux, Didier; Caron, Jérôme; Bézy, Jean-Loup

    2015-09-01

    To increase the accuracy of earth-observation spectro-imagers, it is necessary to achieve high levels of depolarization of the incoming beam. The preferred device in space instruments is the so-called polarization scrambler, made of birefringent crystal wedges arranged in a single or dual Babinet. Today, with required radiometric accuracies of the order of 0.1%, it is necessary to develop tools to find optimal, low-sensitivity solutions quickly and to measure the performance with a high level of accuracy.

  12. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack; Moore, Shirley; Miller, Bart; Hollingsworth, Jeffrey

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal of providing application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  13. Turbine Manufacture

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The machinery pictured is a set of Turbodyne steam turbines which power a sugar mill at Belle Glade, Florida. A NASA-developed computer program called NASTRAN aided the development of these and other turbines manufactured by Turbodyne Corporation's Steam Turbine Division, Wellsville, New York. An acronym for NASA Structural Analysis Program, NASTRAN is a predictive tool that tells development teams how a structural design will perform under service-use conditions. Turbodyne uses NASTRAN to analyze the dynamic behavior of steam turbine components, achieving substantial savings in development costs. One of the most widely used spinoffs, NASTRAN is made available to private industry through NASA's Computer Software Management and Information Center (COSMIC) at the University of Georgia.

  14. Simulink/PARS Integration Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, B.; Nakhaee, N.

    2013-12-18

    The state of the art for signal processor hardware has far outpaced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group a heterogeneous set of multiple processors into a single system, with different portions of the desired problem set assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS” whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real-time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high-level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA report describes the collaboration between ORNL and Sundance DSP, Inc.

  15. A software tool for determination of breast cancer treatment methods using data mining approach.

    PubMed

    Cakır, Abdülkadir; Demirel, Burçin

    2011-12-01

    In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help the oncology doctor choose treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied one by one to this dataset and the results are compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron, and Decision Table) to find out which one gives the better result for each attribute to predict, through a Java NetBeans interface. Treatment methods are determined for the post-surgical care of breast cancer patients using this software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes: for the hormonotherapy output, IB1; for the tamoxifen and radiotherapy outputs, Multilayer Perceptron; and for the chemotherapy output, Decision Table shows the best accuracy performance. In conclusion, this work shows that data mining can be a useful tool for medical applications, particularly at the treatment decision step, and can help the doctor decide in a short time.
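
    An analogous workflow can be sketched in Python with scikit-learn instead of Weka (illustrative only; the data are synthetic, and the Decision Table algorithm, which has no direct scikit-learn counterpart, is approximated by a shallow decision tree):

        # Analogous workflow in scikit-learn rather than Weka (illustrative
        # only): compare one classifier per output attribute, as the study did
        # with IB1 (1-nearest-neighbour), Multilayer Perceptron, and Decision
        # Table (approximated here by a shallow decision tree).
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(462, 8))            # hypothetical patient features
        y = (X[:, 0] + X[:, 3] > 0).astype(int)  # hypothetical treatment label

        models = {
            "IB1 (1-NN)": KNeighborsClassifier(n_neighbors=1),
            "Multilayer Perceptron": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1),
            "Decision Table (tree stand-in)": DecisionTreeClassifier(max_depth=3, random_state=1),
        }
        # Pick the most accurate model for this output attribute.
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(f"{name}: mean CV accuracy {acc:.3f}")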

  16. PIRIA: a general tool for indexing, search, and retrieval of multimedia content

    NASA Astrophysics Data System (ADS)

    Joint, Magali; Moellic, Pierre-Alain; Hede, P.; Adam, P.

    2004-05-01

    The Internet is a continuously expanding source of multimedia content and information. There are many products in development to search, retrieve, and understand multimedia content, but most current image search/retrieval engines rely on an image database manually pre-indexed with keywords. Computers are still powerless to understand the semantic meaning of still or animated image content. Piria (Program for the Indexing and Research of Images by Affinity), the search engine we have developed, brings this possibility closer to reality. Piria is a novel search engine that uses the query-by-example method. A user query is submitted to the system, which then returns a list of images ranked by similarity, obtained by a metric distance that operates on every indexed image signature. These indexed images are compared according to several different classifiers, not only keywords but also form, color, and texture, taking into account geometric transformations such as rotation, symmetry, and mirroring. Form: edges extracted by an efficient segmentation algorithm. Color: histogram, semantic color segmentation, and spatial color relationships. Texture: texture wavelets and local edge patterns. If required, Piria is also able to fuse results from multiple classifiers with a new classification of index categories: Single Indexer Single Call (SISC), Single Indexer Multiple Call (SIMC), Multiple Indexers Single Call (MISC), or Multiple Indexers Multiple Call (MIMC). Commercial and industrial applications are explored and discussed, as well as current and future development.
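
    The query-by-example ranking can be sketched as follows (an illustrative reconstruction, not PIRIA's actual indexers; only a simple colour-histogram signature and an L1 metric distance are shown):

        # Minimal query-by-example sketch (not PIRIA's indexers): each image is
        # reduced to a colour-histogram signature, and the database is ranked
        # by a metric (L1) distance to the query's signature.
        import numpy as np

        def signature(image, bins=8):
            """Normalised per-channel colour histogram of an RGB image array."""
            hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
                     for c in range(3)]
            sig = np.concatenate(hists).astype(float)
            return sig / sig.sum()

        def rank(query, database):
            """Return database indices sorted by L1 distance to the query."""
            q = signature(query)
            dists = [np.abs(q - signature(img)).sum() for img in database]
            return np.argsort(dists)

        rng = np.random.default_rng(2)
        db = [rng.integers(0, 256, size=(32, 32, 3)) for _ in range(10)]
        print(rank(db[3], db))   # index 3 ranks first (distance zero to itself)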

  17. A machine learning model to determine the accuracy of variant calls in capture-based next generation sequencing.

    PubMed

    van den Akker, Jeroen; Mishne, Gilad; Zimmer, Anjali D; Zhou, Alicia Y

    2018-04-17

    Next generation sequencing (NGS) has become a common technology for clinical genetic tests. The quality of NGS calls varies widely and is influenced by features like reference sequence characteristics, read depth, and mapping accuracy. With recent advances in NGS technology and software tools, the majority of variants called using NGS alone are in fact accurate and reliable. However, a small subset of difficult-to-call variants that still require orthogonal confirmation exists. For this reason, many clinical laboratories confirm NGS results using orthogonal technologies such as Sanger sequencing. Here, we report the development of a deterministic machine-learning-based model to differentiate between these two types of variant calls: those that do not require confirmation using an orthogonal technology (high confidence), and those that require additional quality testing (low confidence). This approach allows reliable NGS-based calling in a clinical setting by identifying the few important variant calls that require orthogonal confirmation. We developed and tested the model using a set of 7179 variants identified by a targeted NGS panel and re-tested by Sanger sequencing. The model incorporated several signals of sequence characteristics and call quality to determine whether a variant was identified at high or low confidence. The model was tuned to eliminate false positives, defined as variants that were called by NGS but not confirmed by Sanger sequencing. The model achieved very high accuracy: 99.4% (95% confidence interval: +/- 0.03%). It categorized 92.2% (6622/7179) of the variants as high confidence, and 100% of these were confirmed to be present by Sanger sequencing. Among the variants categorized as low confidence, defined as NGS calls of low quality that are likely to be artifacts, 92.1% (513/557) were found not to be present by Sanger sequencing. This work shows that NGS data contain sufficient characteristics for a machine-learning-based model to differentiate low- from high-confidence variants. It also reveals the importance of incorporating site-specific features as well as variant call features in such a model.
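
    The thresholding logic can be sketched as follows (illustrative only, not the published model; the features and data are synthetic stand-ins for signals such as read depth and mapping quality):

        # Illustrative sketch (not the published model): classify variant
        # calls from call-quality features, then raise the decision threshold
        # until no "high confidence" call is a Sanger-refuted false positive
        # on held-out data; everything below the threshold goes to review.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        # Hypothetical features: depth, mapping quality, strand bias, GC.
        X = rng.normal(size=(7179, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] > -1).astype(int)   # 1 = Sanger-confirmed

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=3)
        clf = RandomForestClassifier(n_estimators=200, random_state=3)
        clf.fit(X_tr, y_tr)

        probs = clf.predict_proba(X_te)[:, 1]
        for thresh in np.linspace(0.5, 1.0, 51):
            high = probs >= thresh
            if high.any() and (y_te[high] == 1).all():
                print(f"threshold {thresh:.2f}: "
                      f"{high.mean():.1%} auto-passed, 0 false positives")
                break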

  18. Pressure-induced critical influences on workpiece-tool thermal interaction in high speed dry machining of titanium

    NASA Astrophysics Data System (ADS)

    Abdel-Aal, H. A.; Mansori, M. El

    2012-12-01

    Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in a dry cutting environment, especially when cutting the so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature; rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although the description of thermal conduction is non-linear within such a consideration, it is beneficial to employ this form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.

  19. A Modeling Tool for Household Biogas Burner Flame Port Design

    NASA Astrophysics Data System (ADS)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost, respectively. To support the ongoing scaling effort of biogas in rural communities, this study developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created with guidance from previous studies found in the literature. The three highest-performing designs identified by the tool were manufactured and tested experimentally to validate the tool output and to compare against the original port geometry. The experimental results aligned with the tool's predictions for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity, and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
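
    The targeted port parameter is straightforward to compute: for a cross-section of area A and wetted perimeter P, the hydraulic diameter is D_h = 4A/P. A short sketch for the two port shapes studied (the dimensions below are hypothetical, not the Lotus stove's actual geometry):

        # Hydraulic diameter D_h = 4*A/P for candidate flame-port shapes; the
        # dimensions below are hypothetical, not the Lotus stove's geometry.
        def hydraulic_diameter_circle(d):
            # 4 * (pi * d**2 / 4) / (pi * d) simplifies to d itself.
            return d

        def hydraulic_diameter_rect(w, h):
            return 4.0 * (w * h) / (2.0 * (w + h))

        print(hydraulic_diameter_circle(0.003))        # 3 mm circular port
        print(hydraulic_diameter_rect(0.002, 0.006))   # 2 mm x 6 mm slot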

  20. Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan

    2012-01-01

    The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster, and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next-generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is the simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. The key components of the overall methodology presented in this paper are the following: (a) a high-fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with the Geometric Conservation Law (GCL) in order to maintain the conservative property on moving meshes for the second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion, which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approximately 250 million cells) in full turbomachinery geometries.
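
    The cloud-of-points interface treatment can be sketched with an off-the-shelf kd-tree (illustrative only, not the Loci-STREAM implementation): for each receiver point, the k nearest donor points are located and their values blended by inverse distance.

        # Sketch of cloud-of-points interpolation across a sliding interface
        # (not the Loci-STREAM implementation): for each receiver point, find
        # the k nearest donor points with a kd-tree and blend their values by
        # inverse-distance weighting.
        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(4)
        donor_xyz = rng.uniform(size=(1000, 3))      # donor-side face centroids
        donor_val = np.sin(donor_xyz[:, 0] * 6.0)    # field carried by donors
        receiver_xyz = rng.uniform(size=(5, 3))      # receiver-side centroids

        tree = cKDTree(donor_xyz)
        dist, idx = tree.query(receiver_xyz, k=4)    # k nearest donors each
        weights = 1.0 / np.maximum(dist, 1e-12)      # guard exact coincidence
        weights /= weights.sum(axis=1, keepdims=True)
        receiver_val = (weights * donor_val[idx]).sum(axis=1)
        print(receiver_val)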

  1. GEOquery: a bridge between the Gene Expression Omnibus (GEO) and BioConductor.

    PubMed

    Davis, Sean; Meltzer, Paul S

    2007-07-15

    Microarray technology has become a standard molecular biology tool. Experimental data have been generated on a huge number of organisms, tissue types, treatment conditions and disease states. The Gene Expression Omnibus (Barrett et al., 2005), developed by the National Center for Biotechnology Information (NCBI) at the National Institutes of Health, is a repository of nearly 140,000 gene expression experiments. The BioConductor project (Gentleman et al., 2004) is an open-source and open-development software project built in the R statistical programming environment (R Development Core Team, 2005) for the analysis and comprehension of genomic data. The tools contained in the BioConductor project represent many state-of-the-art methods for the analysis of microarray and genomics data. We have developed a software tool that allows access to the wealth of information within GEO directly from BioConductor, eliminating many of the formatting and parsing problems that have made such analyses labor-intensive in the past. The software, called GEOquery, effectively establishes a bridge between GEO and BioConductor. Easy access to GEO data from BioConductor will likely lead to new analyses of GEO data using novel and rigorous statistical and bioinformatic tools. Facilitating analyses and meta-analyses of microarray data will increase the efficiency with which biologically important conclusions can be drawn from published genomic data. GEOquery is available as part of the BioConductor project.

  2. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software-development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost-estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve their estimates of the costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
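
    The exhaustive parameter-search idea can be illustrated with the classic COCOMO form E = a * KLOC^b (a simplification, not COCOMOST's actual models or data; the project history below is hypothetical):

        # Minimal illustration of exhaustive parameter search for effort
        # estimation using the classic COCOMO form E = a * KLOC**b (not
        # COCOMOST's actual models): scan (a, b) over historical projects and
        # keep the pair that minimises the log-space error, which also gives
        # an error estimate alongside the cost estimate.
        import numpy as np

        kloc = np.array([10, 25, 60, 120, 300], dtype=float)      # history
        effort = np.array([24, 70, 190, 440, 1300], dtype=float)  # person-months

        best = None
        for a in np.linspace(1.0, 4.0, 61):
            for b in np.linspace(0.9, 1.3, 41):
                resid = np.log(effort) - np.log(a * kloc**b)
                rmse = np.sqrt((resid**2).mean())
                if best is None or rmse < best[0]:
                    best = (rmse, a, b)

        rmse, a, b = best
        print(f"a={a:.2f}, b={b:.2f}, log-space RMSE={rmse:.3f}")
        print(f"estimate for 80 KLOC: {a * 80**b:.0f} person-months")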

  3. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The discussion covers the Qualitative Simulation Tool (QST), an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  4. Digital Images on the DIME

    NASA Technical Reports Server (NTRS)

    2003-01-01

    With NASA on its side, Positive Systems, Inc., of Whitefish, Montana, is veering away from the industry standards defined for producing and processing remotely sensed images. A top developer of imaging products for geographic information system (GIS) and computer-aided design (CAD) applications, Positive Systems is bucking traditional imaging concepts with a cost-effective and time-saving software tool called Digital Images Made Easy (DIME(trademark)). Like piecing a jigsaw puzzle together, DIME can integrate a series of raw aerial or satellite snapshots into a single, seamless panoramic image, known as a 'mosaic.' The 'mosaicked' images serve as useful backdrops to GIS maps - which typically consist of line drawings called 'vectors' - by allowing users to view a multidimensional map that provides substantially more geographic information.

  5. Multi-media authoring - Instruction and training of air traffic controllers based on ASRS incident reports

    NASA Technical Reports Server (NTRS)

    Armstrong, Herbert B.; Roske-Hofstrand, Renate J.

    1989-01-01

    This paper discusses the use of computer-assisted instruction and flight simulations to enhance procedural and perceptual motor task training. Attention is called to the fact that incorporating the accident and incident data contained in reports filed with the Aviation Safety Reporting System (ASRS) would provide a valuable training tool whose lessons the learner could apply to other situations. The need to segment the events is emphasized; this would make it possible to modify events in order to suit the needs of the training environment. Methods were developed for designing meaningful runway-incursion scenarios on the basis of analysis of ASRS reports. It is noted that, while the development of interactive training tools using the ASRS and other databases holds much promise, the design and production of interactive video programs and laser disks are very expensive. It is suggested that this problem may be overcome by sharing the costs of production to develop a library of materials available to a broad range of users.

  6. Interim Draft: Biological Sampling and Analysis Plan Outline ...

    EPA Pesticide Factsheets

    This interim sampling and analysis plan (SAP) outline was developed specifically as an outline of the output that will be generated by an on-line tool under development called the MicroSAP. The goal of the MicroSAP tool is to assist users with the development of SAPs needed for the site characterization, verification sampling, and post-decontamination sampling stages of biological sampling and analysis activities in which the EPA would be responsible for conducting sampling. These activities could include sampling and analysis for a biological contamination incident, a research study, or an exercise. The development of this SAP outline did not consider the initial response to an incident, as it is assumed that the initial response would have been completed by another agency, or the clearance phase, as it is assumed that a separate committee would be established to make decisions regarding clearing a site. This outline also includes considerations for capturing the associated data quality objectives in the SAP.

  7. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated by satellite radar are combined with standard image processing techniques to create a user environment for manipulating and analyzing airborne and satellite radar images. One aim is to create radar products from the original data that make the contents easier to understand; the results, called secondary image products, derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps, or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  8. New computational tools for H/D determination in macromolecular structures from neutron data.

    PubMed

    Siliqi, Dritan; Caliandro, Rocco; Carrozzini, Benedetta; Cascarano, Giovanni Luca; Mazzone, Annamaria

    2010-11-01

    Two new computational methods dedicated to neutron crystallography, called n-FreeLunch and DNDM-NDM, have been developed and successfully tested. The aim in developing these methods is to determine hydrogen and deuterium positions in macromolecular structures by using information from neutron density maps. Of particular interest is resolving cases in which the geometrically predicted hydrogen or deuterium positions are ambiguous. The methods are an evolution of approaches that are already applied in X-ray crystallography: extrapolation beyond the observed resolution (known as the FreeLunch procedure) and a difference electron-density modification (DEDM) technique combined with the electron-density modification (EDM) tool (known as DEDM-EDM). It is shown that the two methods are complementary to each other and are effective in finding the positions of H and D atoms in neutron density maps.

  9. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called the Microscopy Image Segmentation Tool (MIST). MIST is designed for the analysis of microscopy images that contain large collections of small regions of interest (ROIs). Originally developed for the analysis of scanning electron images of porous anodic alumina, MIST's capabilities have been expanded to allow use in a large variety of problems, including the analysis of biological tissue and of inorganic and organic film grain structure, as well as nano- and mesoscopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized, user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
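
    The core segmentation step can be sketched as follows (illustrative only, not MIST's actual algorithm): threshold the micrograph, label connected regions, and report per-ROI statistics.

        # Sketch of the core segmentation step (not MIST's actual algorithm):
        # threshold a grayscale micrograph, label connected regions of
        # interest, and report per-ROI areas and centroids.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)
        image = 0.1 * rng.normal(size=(128, 128))   # quiet background
        image[30:40, 30:40] += 4.0                  # two synthetic bright ROIs
        image[80:95, 60:70] += 4.0

        mask = image > 2.0                    # global threshold
        labels, n_rois = ndimage.label(mask)  # connected-component labelling
        idx = list(range(1, n_rois + 1))
        areas = ndimage.sum(mask, labels, index=idx)
        centroids = ndimage.center_of_mass(mask, labels, index=idx)
        for i, (area, c) in enumerate(zip(areas, centroids), start=1):
            print(f"ROI {i}: area={area:.0f} px, centroid={c}")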

  10. Using appreciative inquiry to facilitate implementation of the recovery model in mental health agencies.

    PubMed

    Clossey, Laurene; Mehnert, Kevin; Silva, Sara

    2011-11-01

    This article describes an organizational development tool called appreciative inquiry (AI) and its use in mental health to aid agencies implementing recovery model services. AI is a discursive tool with the power to shift dominant organizational cultures. Its philosophical underpinnings emphasize values consistent with recovery: community, empowerment, and positive focus. Recent research in the field of mental health demonstrates the salience of organizational cultural context in affecting new service adoption. This article explores how AI could be helpful in shifting an organization's culture to render it compatible with recovery through descriptions of two mental health centers' use of the tool. The experiences described indicate that AI, if used consistently, empowers staff. The article concludes with consideration of the implications of this empowerment for recovery model implementation and directions for future research.

  11. Scientific crowdsourcing in wildlife research and conservation: Tigers (Panthera tigris) as a case study

    PubMed Central

    Can, Özgün Emre; D’Cruze, Neil; Balaskas, Margaret; Macdonald, David W.

    2017-01-01

    With around 3,200 tigers (Panthera tigris) left in the wild, the governments of 13 tiger range countries recently declared that there is a need for innovation to aid tiger research and conservation. In response to this call, we created the “Think for Tigers” study to explore whether crowdsourcing has the potential to innovate the way researchers and practitioners monitor tigers in the wild. The study demonstrated that the benefits of crowdsourcing are not restricted only to harnessing the time, labor, and funds from the public but can also be used as a tool to harness creative thinking that can contribute to development of new research tools and approaches. Based on our experience, we make practical recommendations for designing a crowdsourcing initiative as a tool for generating ideas. PMID:28328924

  12. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  13. Timbre Brownfield Prioritization Tool to support effective brownfield regeneration.

    PubMed

    Pizzol, Lisa; Zabeo, Alex; Klusáček, Petr; Giubilato, Elisa; Critto, Andrea; Frantál, Bohumil; Martinát, Standa; Kunc, Josef; Osman, Robert; Bartke, Stephan

    2016-01-15

    In the last decade, the regeneration of derelict or underused sites fully or partly located in urban areas (so-called "brownfields") has become more common, since freely developable land (so-called "greenfields") has increasingly become a scarce and, hence, more expensive resource, especially in densely populated areas. Although the regeneration of brownfield sites can offer development potential, the complexity of these sites requires considerable effort to successfully complete their revitalization projects, and the proper selection of promising sites is a pre-requisite for efficiently allocating the limited financial resources. The identification and analysis of success factors for brownfield site regeneration can support investors and decision makers in selecting those sites which are the most advantageous for successful regeneration. The objective of this paper is to present the Timbre Brownfield Prioritization Tool (TBPT), developed as a web-based solution to assist stakeholders responsible for wider territories or clusters of brownfield sites (portfolios) in identifying which brownfield sites should preferably be considered for redevelopment or further investigation. The prioritization approach is based on a set of success factors identified through a systematic stakeholder engagement procedure. Within the TBPT these success factors are integrated by means of a Multi Criteria Decision Analysis (MCDA) methodology, which includes stakeholders' requalification objectives and perspectives related to the brownfield regeneration process and takes into account the three pillars of sustainability (economic, social and environmental dimensions). The tool has been applied to the South Moravia case study (Czech Republic), considering two different requalification objectives identified by local stakeholders, namely the selection of suitable locations for the development of a shopping centre and a solar power plant, respectively. The application of the TBPT to the case study showed that it is flexible and easy to adapt to different local contexts, allowing assessors to introduce locally relevant parameters identified according to their expertise and the availability of local data. Copyright © 2015 Elsevier Ltd. All rights reserved.
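
    The aggregation step can be illustrated with a simple weighted sum over normalized criteria (a minimal sketch; the TBPT's actual MCDA methodology is richer, and the sites, scores, and weights below are hypothetical):

        # Simple weighted-sum MCDA ranking in the spirit of the TBPT
        # (illustrative only; the tool's actual aggregation is richer, and
        # the sites, scores, and weights are hypothetical).
        import numpy as np

        sites = ["Site A", "Site B", "Site C"]
        # Columns: economic, social, environmental scores (higher is better).
        scores = np.array([
            [0.8, 0.4, 0.6],
            [0.5, 0.9, 0.7],
            [0.6, 0.6, 0.9],
        ])
        weights = np.array([0.5, 0.2, 0.3])   # stakeholder-elicited priorities

        # Normalise each criterion to [0, 1], then aggregate and rank.
        norm = (scores - scores.min(axis=0)) / np.ptp(scores, axis=0)
        totals = norm @ weights
        for site, t in sorted(zip(sites, totals), key=lambda p: -p[1]):
            print(f"{site}: {t:.2f}")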

  14. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
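
    The code-generation idea can be illustrated with a toy generator that turns a table description into the SQL that creates it (a sketch only, not the INEL tools themselves, which also emitted FORTRAN declarations and screen-management calls):

        # Toy illustration of the code-generation idea (not the INEL tools):
        # turn a table description into the CREATE TABLE statement for it.
        def create_table_sql(table, columns):
            """columns: list of (name, sql_type, nullable) triples."""
            defs = ",\n  ".join(
                f"{name} {sql_type}{'' if nullable else ' NOT NULL'}"
                for name, sql_type, nullable in columns
            )
            return f"CREATE TABLE {table} (\n  {defs}\n);"

        print(create_table_sql("sample", [
            ("sample_id", "INTEGER", False),
            ("collected_on", "DATE", False),
            ("analyst", "VARCHAR(40)", True),
        ]))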

  15. Hair Styling Appliances

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The key tool of Redken Laboratories' new line of hair styling appliances is an instrument called a thermograph, a heat-sensing device originally developed by Hughes Aircraft Co. under U.S. Army and NASA funding. Redken Laboratories bought one of the early models of the Hughes Probeye Thermal Video System, or TVS, which detects the various degrees of heat emitted by an object and displays the results in color on a TV monitor, with colors representing the different temperatures detected.

  16. Afghan National Engineer Brigade: Despite U.S. Training Efforts, the Brigade is Incapable of Operating Independently

    DTIC Science & Technology

    2016-01-01

    carpentry, masonry, and the operation of heavy equipment. Plans called for the NEB to receive at least $29 million in engineering equipment and... JTF Sapper, NMCB 25, and NMCB 28 had responsibility for training the NEB in such areas as plumbing, electrical work, carpentry, masonry, and... measurement tool consisted of five possible ratings: fully capable, capable, partially capable, developing, and established. USFOR-A used these

  17. Diagnosing alternative conceptions of Fermi energy among undergraduate students

    NASA Astrophysics Data System (ADS)

    Sharma, Sapna; Ahluwalia, Pardeep Kumar

    2012-07-01

    Physics education researchers have scientifically established that the understanding of new concepts and the interpretation of incoming information are strongly influenced by students' preexisting knowledge and beliefs, called epistemological beliefs. This can lead to a gap between what students actually learn and what the teacher expects them to learn. As a teacher in a classroom, it is desirable to bridge this gap, at least for the key concepts of the field being taught. One such key concept, which crops up in statistical physics and solid-state physics courses and around which the behaviour of materials is described, is the Fermi energy (εF). In this paper, we present results about misconceptions of Fermi energy that emerged in the process of administering a diagnostic tool called the Statistical Physics Concept Survey, developed by the authors. It deals with eight themes of basic importance in learning undergraduate solid-state physics and statistical physics. The question items of the tool were put through well-established sequential processes: definition of themes, a Delphi study, interviews with students, drafting of questions, administration, and checks of the tool's validity and reliability. The tool was administered to a group of undergraduate and postgraduate students in a pre-test and post-test design. In this paper, we have taken one theme of the diagnostic tool, Fermi energy, for analysis and discussion. Students' responses and the reasoning comments given during interviews were analysed. This analysis helped us to identify prevailing misconceptions and learning gaps among students on this topic. How spreadsheets can be used effectively to remove the identified misconceptions, and to help students appreciate the finer nuances in visualizing the behaviour of the system around the Fermi energy, normally sidestepped by both teachers and learners, is also presented in this paper.
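
    The behaviour the spreadsheets visualize is the Fermi-Dirac occupancy f(E) = 1/(exp((E - εF)/kBT) + 1); a short sketch follows (the Fermi energy value below is illustrative):

        # Fermi-Dirac occupancy around the Fermi energy, the behaviour the
        # paper's spreadsheets visualise: f(E) = 1 / (exp((E - E_F)/kT) + 1).
        import numpy as np

        k_B = 8.617e-5                  # Boltzmann constant, eV/K
        E_F = 5.0                       # illustrative Fermi energy, eV
        E = np.linspace(4.8, 5.2, 9)    # energies straddling E_F

        for T in (100.0, 300.0, 1000.0):
            f = 1.0 / (np.exp((E - E_F) / (k_B * T)) + 1.0)
            print(f"T={T:5.0f} K:", np.round(f, 3))
        # At E = E_F the occupancy is exactly 1/2 at any temperature; the
        # step sharpens as T falls, the key qualitative point for students.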

  18. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows

    PubMed Central

    O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore (https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774

  19. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows.

    PubMed

    O'Connor, Brian D; Yuen, Denis; Chung, Vincent; Duncan, Andrew G; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore (https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH).

  20. Specification and Error Pattern Based Program Monitoring

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Johnson, Scott; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We briefly present Java PathExplorer (JPaX), a tool developed at NASA Ames for monitoring the execution of Java programs. JPaX can be used not only during program testing to reveal subtle errors, but also during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program in order to properly observe its execution. The instrumentation can be either at the bytecode level or at the source level when the source code is available. JPaX is an instance of a more general project, called PathExplorer (PAX), which is a basis for experiments rather than a fixed system, capable of monitoring various programming languages and experimenting with other logics and analysis techniques.

  1. GAPIT: genome association and prediction integrated tool.

    PubMed

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  2. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  3. MY SIRR: Minimalist agro-hYdrological model for Sustainable IRRigation management-Soil moisture and crop dynamics

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Manfreda, Salvatore; Celano, Giuseppe

    The paper introduces a minimalist water-driven crop model for sustainable irrigation management using an eco-hydrological approach. This model, called MY SIRR, uses a relatively small number of parameters and attempts to balance simplicity, accuracy, and robustness. MY SIRR is a quantitative tool to assess water requirements and agricultural production across different climates, soil types, crops, and irrigation strategies. The MY SIRR source code is published under a copyleft license. The FOSS approach could lower the financial barriers for smallholders, especially in developing countries, to the use of tools for better decision-making on strategies for short- and long-term water resource management.
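
    The single-bucket water balance typical of minimalist water-driven crop models can be sketched as follows (an illustrative sketch, not the MY SIRR source code; all parameter values are hypothetical):

        # Sketch of the single-bucket water balance behind minimalist
        # water-driven crop models (illustrative; not the MY SIRR code).
        # Daily update of relative soil moisture s in a root-zone bucket:
        # s[t+1] = s[t] + (rain + irrigation - ET - leakage) / (n * Zr)
        import numpy as np

        n, Zr = 0.45, 0.4            # porosity [-] and root depth [m]
        s_star, ET_max = 0.6, 4e-3   # stress threshold and max ET [m/day]

        rng = np.random.default_rng(6)
        rain = rng.exponential(2e-3, size=120) * (rng.random(120) < 0.3)

        s = 0.5                      # initial relative soil moisture
        trace = []
        for r in rain:
            irrigation = 3e-3 if s < 0.35 else 0.0   # simple demand rule
            et = ET_max * min(1.0, s / s_star)       # ET limited by moisture
            leak = max(0.0, s - 0.9) * n * Zr        # drain excess storage
            s = min(1.0, max(0.0, s + (r + irrigation - et - leak) / (n * Zr)))
            trace.append(s)
        print(f"mean soil moisture: {np.mean(trace):.2f}")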

  4. Computer-Aided Air-Traffic Control In The Terminal Area

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    Developmental computer-aided system for automated management and control of arrival traffic at large airports includes three integrated subsystems: one called the Traffic Management Advisor, another called the Descent Advisor, and a third called the Final Approach Spacing Tool. A database that includes current wind measurements and mathematical models of the performance of different types of aircraft contributes to the effective operation of the system.

  5. ESIF Call for High-Impact Integrated Projects | Energy Systems Integration

    Science.gov Websites

    As a U.S. Department of Energy user facility, the Energy Systems Integration Facility supports development of the concepts, tools, and technologies needed to measure, analyze, predict, protect, and control the grid of the future.

  6. Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses

    ERIC Educational Resources Information Center

    Sadaghian, Shirin; Marandi, S. Susan

    2016-01-01

    The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…

  7. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches, and earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to high-performance computers at USC and at TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME are stored in a digital library system, the Storage Resource Broker (SRB), which provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools that utilize ontological descriptions of SHA software and data have been implemented and can validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map-making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D GeoWall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT intern program, which provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as part of this undergraduate research program.

  8. A new approach to road accident rescue.

    PubMed

    Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A

    2016-01-01

    This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving the efficiency and safety of evacuating people and thereby reducing the number of road-accident victims. Different tests supported by professionals and experts were designed under different circumstances, with different categories of damaged vehicles from real accidents and simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue was developed. This tool is based on a computer system that allows efficient access to the technical information of the vehicle and the medical information of its usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue-assistance system makes vital information accessible to post-trauma care services, improving the effectiveness of interventions by the emergency services, reducing rescue time, and therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.

  9. Near-Infrared Neuroimaging with NinPy

    PubMed Central

    Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas

    2009-01-01

    There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449

  10. Automation of Coordinated Planning Between Observatories: The Visual Observation Layout Tool (VOLT)

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Koratkar, Anuradha; Kerbel, Uri; Pell, Vince

    2002-01-01

    Fulfilling the promise of the era of great observatories, NASA now has more than three space-based astronomical telescopes operating in different wavebands. This situation provides astronomers with the unique opportunity of simultaneously observing a target in multiple wavebands with these observatories. Currently, scheduling multiple observatories simultaneously for coordinated observations is highly inefficient: coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Because they are time-consuming and expensive to schedule, observatories often limit the number of coordinated observations that can be conducted. In order to exploit new paradigms for observatory operation, the Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center has developed a tool called the Visual Observation Layout Tool (VOLT). The main objective of VOLT is to provide a visual tool to automate the planning of coordinated observations by multiple astronomical observatories. Four of NASA's space-based astronomical observatories - the Hubble Space Telescope (HST), the Far Ultraviolet Spectroscopic Explorer (FUSE), the Rossi X-ray Timing Explorer (RXTE), and Chandra - are enthusiastically pursuing the use of VOLT. This paper will focus on the purpose for developing VOLT, as well as the lessons learned during the infusion of VOLT into the planning and scheduling operations of these observatories.

  11. VoiceThread: A Useful Program Evaluation Tool

    ERIC Educational Resources Information Center

    Mott, Rebecca

    2018-01-01

    With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…

  12. ["Baltic Declaration"--telemedicine and mHealth as support for clinical processes in cardiology. The opinion of the Committee of Informatics and Telemedicine of the Polish Society of Cardiology and Telemedicine Clinical Sciences Committee of the PAS].

    PubMed

    Piotrowicz, Ryszard; Grabowski, Marcin; Balsam, Paweł; Kołtowski, Łukasz; Kozierkiewicz, Adam; Zajdel, Justyna; Piotrowicz, Ewa; Kowalski, Oskar; Mitkowski, Przemysław; Kaźmierczak, Jarosław; Kalarus, Zbigniew; Opolski, Grzegorz

    2015-01-01

    For several decades we have observed the development of data-transmission technology on an unprecedented scale. With the development of such technology, concepts have also appeared on the use of these solutions in health care systems. Over the last decade telemedicine has been joined by the concept of mHealth, which relies on mobile devices, mainly to monitor selected biomedical parameters. On 10 October 2014, during the conference Baltic Electrocardiology Autumn - Telemedicine and Arrhythmia (BEATA), a debate was held with the participation of physicians, politicians, businessmen, and representatives of the Government (Ministry of Health, National Health Fund, Social Insurance Institution) concerning the use of telecardiology services in daily practice. During the meeting, issues were discussed such as: telemedicine solutions available throughout the world, analysis of their effectiveness based on clinical trials, funding opportunities, their legal status, and the development perspectives of telecardiology in Poland. The result of the meeting was a document called the "Baltic Declaration". The declaration is a call for proven and profitable technologies to be introduced into clinical practice. The declaration also indicates that the various available technological solutions are merely tools, and that the utility of such tools stems not only from their modernity, but primarily from matching their functionality to the features of the health interventions that are to be improved.

  13. Development of an on-line aqueous particle sensor to study the performance of inclusions in a 12 tonne, delta shaped full scale water model tundish

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhishek

    Detection of particulate matter thinly dispersed in a fluid medium, with the aid of the difference in electrical conductivity between the pure fluid and the particles, has been practiced for at least the last 50 to 60 years. The first such instruments were employed to measure cell counts in samples of biological fluid. Following a detailed study of the physics and principles operating within the device, called the Electric Sensing Zone (ESZ) principle, a new device called the Liquid Metal Cleanliness Analyzer (LiMCA) was invented which could measure and count inclusion particles in molten metal. It provided a fast and fairly accurate tool for making online measurements of the quality of steel during refining and casting operations. Along similar lines of development as the LiMCA, a water analogue of the device, called the Aqueous Particle Sensor (APS), was developed for physical modeling experiments of metal-refining operations involving water models. The APS can detect and measure simulated inclusion particles added to the working fluid (water). The present study involves the design, construction, and final application of a new and improved APS in water-modeling experiments to study inclusion behavior in a tundish operation. The custom-built instrument shows superior performance and applicability in experiments involving physical modeling of metal-refining operations, compared to its commercial counterparts. In addition to higher accuracy and a wider range of operating parameters, its capability to take real-time experimental data for extended periods of time helps to reduce the total number of experiments required to reach a result, and makes it suitable for analyzing temporal changes occurring in unsteady systems. With the modern impetus on the quality of the final product of metallurgical operations, the new APS can prove to be an indispensable research tool for studying and putting forward innovative design and parametric changes in industrially practised metallurgical operations.
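
    Purely as an illustration of the ESZ relation described above, here is a minimal Python sketch. It assumes the standard first-order Coulter-type result that a particle of diameter d much smaller than the orifice diameter D raises the orifice resistance by roughly 4*rho*d^3/(pi*D^4); the numeric values (fluid resistivity, orifice size, sensing current, pulse height) are invented, not APS calibration data.

    ```python
    # First-order Electric Sensing Zone (ESZ) inversion: a voltage pulse at
    # constant sensing current is converted to an estimated particle size.
    import math

    def particle_diameter(pulse_volts, current_amps, rho_ohm_m, orifice_d_m):
        """Estimate particle diameter (m) from a resistive pulse."""
        delta_r = pulse_volts / current_amps              # resistance jump (ohm)
        d_cubed = delta_r * math.pi * orifice_d_m**4 / (4.0 * rho_ohm_m)
        return d_cubed ** (1.0 / 3.0)

    # Assumed water-like resistivity, a 300 um orifice and a 10 mA current.
    d = particle_diameter(pulse_volts=2e-4, current_amps=0.01,
                          rho_ohm_m=20.0, orifice_d_m=300e-6)
    print(f"estimated particle diameter: {d * 1e6:.1f} um")
    ```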

  14. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    PubMed

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software incompatibility, complicated configuration, and lack of access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant-calling algorithms and accepts various combinations, all by a one-line command, therefore allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
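
    SeqMule's actual one-line interface is the one documented at the project site; the sketch below only illustrates the consensus idea the abstract describes, namely intersecting calls from several variant callers and keeping sites that a minimum number of callers agree on. The file names and the support threshold are assumptions for illustration.

    ```python
    # Consensus calling sketch: intersect variant calls from several VCFs.
    from collections import Counter

    def load_calls(vcf_path):
        """Collect (chrom, pos, ref, alt) keys from a plain-text VCF file."""
        keys = set()
        with open(vcf_path) as fh:
            for line in fh:
                if line.startswith("#"):
                    continue
                chrom, pos, _id, ref, alt = line.split("\t")[:5]
                keys.add((chrom, int(pos), ref, alt))
        return keys

    def consensus(vcf_paths, min_support=2):
        """Variants reported by at least `min_support` of the given callers."""
        counts = Counter()
        for path in vcf_paths:
            counts.update(load_calls(path))
        return {key for key, n in counts.items() if n >= min_support}

    # e.g. consensus(["gatk.vcf", "samtools.vcf", "freebayes.vcf"])
    ```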

  15. ACTG: novel peptide mapping onto gene models.

    PubMed

    Choi, Seunghyuk; Kim, Hyunwoo; Paek, Eunok

    2017-04-15

    In many proteogenomic applications, mapping peptide sequences onto genome sequences can be very useful, because it allows us to understand the origins of the gene products. Existing software tools either take the genomic position of a peptide start site as an input or assume that the peptide sequence exactly matches the coding sequence of a given gene model. In the case of novel peptides resulting from genomic variations, especially structural variations such as alternative splicing, these existing tools cannot be directly applied unless users supply information about the variant, either its genomic position or its transcription model. Mapping potentially novel peptides to genome sequences, while allowing certain genomic variations, requires introducing novel gene models when aligning peptide sequences to gene structures. We have developed a new tool called ACTG (Amino aCids To Genome), which maps peptides to the genome, considering all possible single exon skippings, junction variations within an edit distance of three from the original splice sites, exon extensions, and frame shifts. In addition, it can also consider SNVs (single nucleotide variations) during the mapping phase if a user provides a VCF (variant call format) file as an input. Available at http://prix.hanyang.ac.kr/ACTG/search.jsp . eunokpaek@hanyang.ac.kr. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
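
    ACTG's real mapping additionally models exon skipping, splice-junction edits, exon extension, and frame shifts; the sketch below shows only the simplest underlying step, locating a peptide in a transcript by three-frame translation and exact matching. The codon table is the standard genetic code; the sequences in the example are made up.

    ```python
    # Three-frame translation and exact peptide matching on a transcript.
    BASES = "TCAG"
    AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODON = {a + b + c: AMINO[16 * i + 4 * j + k]
             for i, a in enumerate(BASES)
             for j, b in enumerate(BASES)
             for k, c in enumerate(BASES)}

    def translate(dna):
        """Translate a DNA string codon by codon ('*' marks a stop codon)."""
        dna = dna.upper().replace("U", "T")
        return "".join(CODON.get(dna[i:i + 3], "X")
                       for i in range(0, len(dna) - 2, 3))

    def map_peptide(peptide, transcript):
        """Return (frame, nucleotide offset) of an exact hit, or None."""
        for frame in range(3):
            hit = translate(transcript[frame:]).find(peptide)
            if hit != -1:
                return frame, frame + 3 * hit
        return None

    print(map_peptide("MA", "GATGGCTAA"))   # -> (1, 1)
    ```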

  16. VirSSPA- a virtual reality tool for surgical planning workflow.

    PubMed

    Suárez, C; Acha, B; Serrano, C; Parra, C; Gómez, T

    2009-03-01

    A virtual reality tool, called VirSSPA, was developed to optimize the planning of surgical processes. Segmentation algorithms were implemented for Computed Tomography (CT) images: a region-growing procedure was used for soft tissues, and a thresholding algorithm was implemented to segment bones. The algorithms operate semiautomatically, since they only need the user to select a seed with the mouse on each tissue to be segmented. The novelty of the paper is the adaptation of an enhancement method based on histogram thresholding applied to CT images for surgical planning, which simplifies subsequent segmentation. A substantial improvement of the virtual reality tool VirSSPA was obtained with these algorithms. VirSSPA was used to optimize surgical planning, to decrease the time spent on surgical planning, and to improve operative results. The success rate increases because surgeons are able to see the exact extent of the patient's ailment. This tool can decrease operating-room time, thus resulting in reduced costs. Virtual simulation was effective for optimizing surgical planning, which could, consequently, result in improved outcomes at reduced costs.
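
    VirSSPA's implementation is not reproduced here; as a rough illustration of the two segmentation ideas named in the abstract, the following numpy sketch thresholds bone on an assumed Hounsfield-unit cutoff and grows a soft-tissue region from a user-picked seed on a 2D slice. The threshold and tolerance values are illustrative assumptions.

    ```python
    # Bone thresholding and seeded region growing on a 2D CT slice (numpy).
    import numpy as np

    def segment_bone(ct_hu, threshold=300):
        """Binary bone mask: voxels at or above an (assumed) HU threshold."""
        return ct_hu >= threshold

    def region_grow(ct_hu, seed, tol=60):
        """Grow a region from a user-picked seed while intensity stays in tol."""
        mask = np.zeros(ct_hu.shape, dtype=bool)
        ref = float(ct_hu[seed])
        stack = [seed]
        while stack:
            y, x = stack.pop()
            if mask[y, x] or abs(float(ct_hu[y, x]) - ref) > tol:
                continue
            mask[y, x] = True
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < ct_hu.shape[0] and 0 <= nx < ct_hu.shape[1]:
                    stack.append((ny, nx))
        return mask

    ct = np.array([[0, 10, 400], [5, 12, 500], [8, 15, 450]])
    print(segment_bone(ct))                      # bone voxels
    print(region_grow(ct, seed=(0, 0), tol=20))  # grown soft-tissue region
    ```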

  17. Liquid Phase Sintering

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Industry spends billions of dollars each year on machine tools to manufacture products out of metal. This includes tools for cutting every kind of metal part, from engine blocks to Shuttle main engine components. Cutting tool tips often break because of weak spots or defects in their composition. Based on a new concept called defect trapping, space offers a novel environment to study defect formation in molten metal materials as they solidify. After the return of these materials from space, researchers can evaluate the source of the defects and seek ways to eliminate them in products prepared on Earth. A widely used process for cutting-tip manufacturing is liquid phase sintering. Compared to Earth-sintered samples, which slump due to buoyancy induced by gravity, space samples are uniformly shaped and defects remain where they are formed. By studying metals sintered in space, the US tool industry can potentially enhance its worldwide competitiveness. The Consortium for Materials Development in Space, along with Wyle Labs, Teledyne Advanced Materials, and McDonnell Douglas, has conducted experiments in space.

  18. Microgravity

    NASA Image and Video Library

    2004-04-15

    Industry spends billions of dollars each year on machine tools to manufacture products out of metal. This includes tools for cutting every kind of metal part, from engine blocks to Shuttle main engine components. Cutting tool tips often break because of weak spots or defects in their composition. Based on a new concept called defect trapping, space offers a novel environment to study defect formation in molten metal materials as they solidify. After the return of these materials from space, researchers can evaluate the source of the defects and seek ways to eliminate them in products prepared on Earth. A widely used process for cutting-tip manufacturing is liquid phase sintering. Compared to Earth-sintered samples, which slump due to buoyancy induced by gravity, space samples are uniformly shaped and defects remain where they are formed. By studying metals sintered in space, the US tool industry can potentially enhance its worldwide competitiveness. The Consortium for Materials Development in Space, along with Wyle Labs, Teledyne Advanced Materials, and McDonnell Douglas, has conducted experiments in space.

  19. Developing and validating a perinatal depression screening tool in Kenya blending Western criteria with local idioms: A mixed methods study.

    PubMed

    Green, Eric P; Tuli, Hawa; Kwobah, Edith; Menya, D; Chesire, Irene; Schmidt, Christina

    2018-03-01

    Routine screening for perinatal depression is not common in most primary health care settings. The U.S. Preventive Services Task Force only recently updated their recommendation on depression screening to specifically recommend screening during the pre- and postpartum periods. While practitioners in high-income countries can respond to this new recommendation by implementing one of several existing depression screening tools developed in Western contexts, such as the Edinburgh Postnatal Depression Scale (EPDS) or the Patient Health Questionnaire-9 (PHQ-9), these tools lack strong evidence of cross-cultural equivalence, validity for case finding, and precision in measuring response to treatment in developing countries. Thus, there is a critical need to develop and validate new screening tools for perinatal depression that can be used by lay health workers, primary health care personnel, and patients. Working in rural Kenya, we used free listing, card sorting, and item analysis methods to develop a locally relevant screening tool that blended Western psychiatric concepts with local idioms of distress. We conducted a validation study with a random sample of 193 pregnant women and new mothers to test the diagnostic accuracy of this scale along with the EPDS and PHQ-9. The sensitivity/specificity of the EPDS and PHQ-9 were estimated to be 0.70/0.72 and 0.70/0.73, respectively. This compared to a sensitivity/specificity of 0.90/0.90 for a new 9-item locally developed tool called the Perinatal Depression Screening (PDEPS). Across these three tools, internal consistency reliability ranged from 0.77 to 0.81 and test-retest reliability ranged from 0.57 to 0.67. The prevalence of depression ranged from 5.2% to 6.2% depending on the clinical reference standard. While the EPDS and PHQ-9 are valid and reliable screening tools for perinatal depression in rural Western Kenya, the PDEPS may be a more useful alternative. At less than 10%, the prevalence of depression in this region appears to be lower than other published estimates for African and other low-income countries. Copyright © 2017 Elsevier B.V. All rights reserved.
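
    The validity figures quoted above are ordinary diagnostic-accuracy statistics; as a reminder of how they are computed from screening results against a clinical reference standard, here is a small Python sketch with invented data.

    ```python
    # Sensitivity and specificity from paired screening / reference results.
    def sens_spec(screen, reference):
        pairs = list(zip(screen, reference))
        tp = sum(1 for s, r in pairs if s and r)          # true positives
        fn = sum(1 for s, r in pairs if not s and r)      # false negatives
        tn = sum(1 for s, r in pairs if not s and not r)  # true negatives
        fp = sum(1 for s, r in pairs if s and not r)      # false positives
        return tp / (tp + fn), tn / (tn + fp)

    # Invented example: 1 = positive, 0 = negative.
    sens, spec = sens_spec([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
    ```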

  20. The mobile phone as a tool in improving cancer care in Nigeria.

    PubMed

    Odigie, V I; Yusufu, L M D; Dawotola, D A; Ejagwulu, F; Abur, P; Mai, A; Ukwenya, Y; Garba, E S; Rotibi, B B; Odigie, E C

    2012-03-01

    The use of the mobile phone as a tool for improving cancer care in a low-resource setting was examined. A total of 1176 oncology patients participated in the study; the majority had breast cancer. 58.4% of the patients had no formal education; 10.7% and 9.5% had college or graduate education, respectively. Two out of every three patients lived more than 200 km from a hospital or clinic. One half of the patients rented a phone to call. At 24 months, 97.6% (1132 patients) had sustained their follow-up appointments, as against 19.2% (42 patients) among those who did not receive the phone intervention. 72.8% of calls (14 102 calls) were to discuss illness/treatment, and 14% of the calls were rated as emergencies by the oncologist. 86.2% of patients found the use of the mobile phone convenient/excellent/cheap. 97.6% found the use of the phone worthwhile and preferred the phone to traveling long distances to the hospital/clinic. The patients also felt that they had not been forgotten by their doctors and were being cared for outside the hospital/clinic. Low-resource countries faced with the burden of cancer care, poor patient follow-up, and poor psychosocial support can build on this to overcome the persistent problem of poor communication in their healthcare delivery. The potential is enormous to enhance the use of mobile phones in novel ways: developing helpline numbers that can be called for cancer information from prevention to treatment to palliative care. The ability to reach out by mobile phone to a reliable source for medical information about cancer is something that the international community, having experience with helplines, should undertake with colleagues in Africa, who are experimenting with the mobile phone's potential. Copyright © 2011 John Wiley & Sons, Ltd.

  1. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.

  2. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows.

  3. Accounting for reasonableness: Exploring the personal internal framework affecting decisions about cancer drug funding.

    PubMed

    Sinclair, Shane; Hagen, Neil A; Chambers, Carole; Manns, Braden; Simon, Anita; Browman, George P

    2008-05-01

    Drug decision-makers are involved in developing and implementing policy, procedures, and processes to support health resource allocation regarding drug treatment formularies. A variety of approaches to decision-making, including formal decision-making frameworks, have been developed to support transparent and fair priority setting. Recently, a decision tool, 'The 6-STEPPPs Tool', was developed to assist in making decisions about new cancer drugs within the public health care system. We conducted a qualitative study, utilizing focus groups and participant observation, in order to investigate the internal frameworks that supported and challenged individual participants as they applied this decision tool within a multi-stakeholder decision process. We discovered that health care resource allocation engaged not only the minds of decision-makers but also profoundly called on the often conflicting values of the heart. Objective decision-making frameworks for new drug therapies need to consider the subjective internal frameworks of decision-makers that affect decisions. Understanding the very human internal turmoil experienced by individuals involved in health care resource allocation sheds additional insight into how to account for reasonableness and how to better support difficult decisions through transparent, values-based resource allocation policy, procedures, and processes.

  4. The APECS Virtual Poster Session: a virtual platform for science communication and discussion

    NASA Astrophysics Data System (ADS)

    Renner, A.; Jochum, K.; Jullion, L.; Pavlov, A.; Liggett, D.; Fugmann, G.; Baeseman, J. L.; Apecs Virtual Poster Session Working Group, T.

    2011-12-01

    The Virtual Poster Session (VPS) of the Association of Polar Early Career Scientists (APECS) was developed by early career scientists as an online tool for communicating and discussing science and research beyond the four walls of a conference venue. Poster sessions often are the backbone of a conference where especially early career scientists get a chance to communicate their research, discuss ideas, data, and scientific problems with their peers and senior scientists. There, they can hone their 'elevator pitch', discussion skills and presentation skills. APECS has taken the poster session one step further and created the VPS - the same idea but independent from conferences, travel, and location. All that is needed is a computer with internet access. Instead of letting their posters collect dust on the computer's hard drive, scientists can now upload them to the APECS website. There, others have the continuous opportunity to comment, give feedback and discuss the work. Currently, about 200 posters are accessible contributed by authors and co-authors from 34 countries. Since January 2010, researchers can discuss their poster with a broad international audience including fellow researchers, community members, potential colleagues and collaborators, policy makers and educators during monthly conference calls via an internet platform. Recordings of the calls are available online afterwards. Calls so far have included topical sessions on e.g. marine biology, glaciology, or social sciences, and interdisciplinary calls on Arctic sciences or polar research activities in a specific country, e.g. India or Romania. They attracted audiences of scientists at all career stages and from all continents, with on average about 15 persons participating per call. Online tools like the VPS open up new ways for creating collaborations and new research ideas and sharing different methodologies for future projects, pushing aside the boundaries of countries and nations, conferences, offices, and disciplines, and provide early career scientists with easily accessible training opportunities for their communication and outreach skills, independent of their location and funding situation.

  5. Serious games for health: three steps forwards.

    PubMed

    Drummond, David; Hadchouel, Alice; Tesnière, Antoine

    2017-01-01

    Serious games are educational tools which are more and more used in patient and health professional education. In this article, we discuss three main points that developers and educators need to address during the development of a serious game for health. We first explain how to develop motivating serious games by finding a point where the intrinsic and extrinsic motivations of end users can converge. Then, we propose to identify the features of serious games which enhance their learning effectiveness on the basis of a framework derived from cognitive science and called "the four pillars of learning." Finally, we discuss issues and solutions related to the evaluation of serious games.

  6. Mobile Phone Call Data as a Regional Socio-Economic Proxy Indicator

    PubMed Central

    Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K.; Ylä-Jääski, Antti

    2015-01-01

    The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models for mobility patterns, could also be used for creating new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate with data from other sources, such as news of particular events in the given period, census data, economic activity, the poverty index, and power plant and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives, as they can provide an affordable tool in the context of resource-constrained developing economies such as Côte d'Ivoire's. PMID:25897957
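
    As a sketch of the home/work inference step described above, the following Python fragment labels each user's most frequent call region at night as "home" and during office hours as "work". The hour windows and the (user, hour, region) record layout are assumptions for illustration, not the paper's exact procedure.

    ```python
    # Infer home/work regions from the timing of a user's call events.
    from collections import Counter, defaultdict

    def infer_home_work(records, night=(20, 6), work=(9, 17)):
        """records: iterable of (user, hour_of_day, region) call events."""
        home, office = defaultdict(Counter), defaultdict(Counter)
        for user, hour, region in records:
            if hour >= night[0] or hour < night[1]:
                home[user][region] += 1
            elif work[0] <= hour < work[1]:
                office[user][region] += 1
        def top(counter):
            return counter.most_common(1)[0][0] if counter else None
        return {u: (top(home[u]), top(office[u]))
                for u in set(home) | set(office)}

    calls = [("u1", 22, "A"), ("u1", 23, "A"), ("u1", 10, "B"), ("u1", 14, "B")]
    print(infer_home_work(calls))   # {'u1': ('A', 'B')}
    ```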

  7. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction, funded by the Federal Ministry of Education and Research in Germany (BMBF), which aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope; they target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
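
    A central deterministic metric of the kind MurCSS computes is a mean-squared-error skill score, MSESS = 1 - MSE_hindcast / MSE_reference; here is a minimal numpy sketch of that formula, with made-up hindcast, climatological reference, and observation values standing in for real decadal data.

    ```python
    # Mean-squared-error skill score of a hindcast against a reference forecast.
    import numpy as np

    def msess(hindcast, reference, observations):
        mse_h = np.mean((hindcast - observations) ** 2)
        mse_r = np.mean((reference - observations) ** 2)
        return 1.0 - mse_h / mse_r

    obs = np.array([0.1, 0.3, 0.2, 0.5])      # invented anomaly series
    hind = np.array([0.2, 0.25, 0.25, 0.45])
    clim = np.full_like(obs, obs.mean())      # climatological reference
    print(f"MSESS vs climatology: {msess(hind, clim, obs):.2f}")   # 0.80
    ```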

  8. The Three C's for Urban Science Education

    ERIC Educational Resources Information Center

    Emdin, Chris

    2008-01-01

    In this article, the author outlines briefly what he calls the three C's--a set of tools that can be used to improve urban science education. The author then describes ways that these tools can support students who have traditionally been marginalized. These three aligned and closely connected tools provide practical ways to engage students in…

  9. Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles

    ERIC Educational Resources Information Center

    Carlson, Scott

    2006-01-01

    Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…

  10. Evaluation of Knowla: An Online Assessment and Learning Tool

    ERIC Educational Resources Information Center

    Thompson, Meredith Myra; Braude, Eric John

    2016-01-01

    The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…

  11. Portfolio of Evidence: An Assessment Tool in Promoting Geometry Achievement among Teacher Education College Students

    ERIC Educational Resources Information Center

    Weldeana, Hailu Nigus; Sbhatu, Desta Berhe

    2017-01-01

    Background: This article reports contributions of an assessment tool called Portfolio of Evidence (PE) in learning college geometry. Material and methods: Two classes of second-year students from one Ethiopian teacher education college, assigned into Treatment and Comparison classes, participated. The assessment tools used in the Treatment…

  12. Fishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, G.

    1984-09-01

    Two classifications of fishing jobs are discussed: open hole and cased hole. When there is no casing in the area of the fish, it is called open hole fishing. When the fish is inside the casing, it is called cased hole fishing. The article lists various things that can become a fish, including stuck drill pipe, broken drill pipe, drill collars, the bit, bit cones, hand tools dropped in the well, sanded-up or mud-stuck tubing, stuck packers, and much more. It is suggested that on a fishing job all parties involved should cooperate with each other, and that fishing-tool people obtain all the information concerning the well. That way they can select the right tools and methods to clean out the well as quickly as possible.

  13. VirVarSeq: a low-frequency virus variant detection pipeline for Illumina sequencing using adaptive base-calling accuracy filtering.

    PubMed

    Verbist, Bie M P; Thys, Kim; Reumers, Joke; Wetzels, Yves; Van der Borght, Koen; Talloen, Willem; Aerssens, Jeroen; Clement, Lieven; Thas, Olivier

    2015-01-01

    In virology, massively parallel sequencing (MPS) opens many opportunities for studying viral quasi-species, e.g. in HIV-1- and HCV-infected patients. This is essential for understanding pathways to resistance, which can substantially improve treatment. Although MPS platforms allow in-depth characterization of sequence variation, their measurements still involve substantial technical noise. For Illumina sequencing, single base substitutions are the main error source and impede powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores (Qs) that are useful for differentiating errors from the real low-frequency mutations. A variant calling tool, Q-cpileup, is proposed, which exploits the Qs of nucleotides in a filtering strategy to increase specificity. The tool is embedded in an open-source pipeline, VirVarSeq, which allows variant calling starting from fastq files. Using both plasmid mixtures and clinical samples, we show that Q-cpileup is able to reduce the number of false-positive findings. The filtering strategy is adaptive and provides an optimized threshold for individual samples in each sequencing run. Additionally, linkage information is kept between single-nucleotide polymorphisms as variants are called at the codon level. This enables virologists to have an immediate biological interpretation of the reported variants with respect to their antiviral drug responses. A comparison with existing SNP-caller tools reveals that calling variants at the codon level with Q-cpileup results in an outstanding sensitivity while maintaining a good specificity for variants with frequencies down to 0.5%. The VirVarSeq is available, together with a user's guide and test data, at sourceforge: http://sourceforge.net/projects/virtools/?source=directory. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
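
    The core filtering idea behind Q-cpileup, discarding base calls whose Phred quality falls below a per-sample threshold before counting variants, can be sketched in a few lines. The Phred+33 encoding is the FASTQ/SAM standard; the threshold and pileup layout below are illustrative assumptions, not VirVarSeq's exact implementation.

    ```python
    # Drop base calls whose Phred quality is below a threshold before counting.
    def phred(qual_char, offset=33):
        """Decode one FASTQ/SAM quality character to a Phred score."""
        return ord(qual_char) - offset

    def filter_pileup(bases, quals, min_q=30):
        """Keep only base calls whose quality reaches min_q."""
        return [b for b, q in zip(bases, quals) if phred(q) >= min_q]

    # 'I' decodes to Phred 40 and '#' to Phred 2, so the last call is dropped.
    print(filter_pileup("ACGTA", "IIII#"))   # ['A', 'C', 'G', 'T']
    ```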

  14. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and to promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provide an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. Copyright © 2015. Published by Elsevier B.V.

  15. LANL Transfers Glowing Bio Technology to Sandia Biotech

    ScienceCinema

    Nakhla, Tony; Pino, Tony; Hadley, David

    2018-03-02

    Partnering with Los Alamos National Laboratory, an Albuquerque-based company is seeking to transform the way protein and peptide analysis is conducted around the world. Sandia Biotech is using a biological technology licensed from Los Alamos called split green fluorescent protein (sGFP), as a detecting and tracking tool for the protein and peptide industry, valuable in the fields of Alzheimer's research, drug development and other biotechnology fields using protein folding to understand protein expression and mechanisms of action.

  16. LANL Transfers Glowing Bio Technology to Sandia Biotech

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhla, Tony; Pino, Tony; Hadley, David

    2012-05-21

    Partnering with Los Alamos National Laboratory, an Albuquerque-based company is seeking to transform the way protein and peptide analysis is conducted around the world. Sandia Biotech is using a biological technology licensed from Los Alamos called split green fluorescent protein (sGFP), as a detecting and tracking tool for the protein and peptide industry, valuable in the fields of Alzheimer's research, drug development and other biotechnology fields using protein folding to understand protein expression and mechanisms of action.

  17. DNA Assembly with De Bruijn Graphs Using an FPGA Platform.

    PubMed

    Poirier, Carl; Gosselin, Benoit; Fortier, Paul

    2018-01-01

    This paper presents an FPGA implementation of a DNA assembly algorithm, called Ray, initially developed to run on parallel CPUs. The OpenCL language is used, and the focus is placed on modifying and optimizing the original algorithm to better suit the new parallelization tool and the radically different hardware architecture. The results show that the execution time is roughly one fourth that of the CPU, and factoring in energy consumption yields a tenfold savings.
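
    Ray's port is written in OpenCL; purely as a language-neutral illustration of the data structure it assembles with, here is a tiny De Bruijn graph built in Python from short reads, where nodes are (k-1)-mers and edges are observed k-mers. The value of k and the reads are arbitrary examples.

    ```python
    # Tiny De Bruijn graph: nodes are (k-1)-mers, edges are observed k-mers.
    from collections import defaultdict

    def de_bruijn(reads, k=5):
        graph = defaultdict(set)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].add(kmer[1:])   # edge: prefix -> suffix
        return graph

    graph = de_bruijn(["ATGGCGTGCA", "GGCGTGCAAT"], k=5)
    for node, successors in sorted(graph.items()):
        print(node, "->", ", ".join(sorted(successors)))
    ```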

  18. PSPVDC: An Adaptation of the PSP that Incorporates Verified Design by Contract

    DTIC Science & Technology

    2013-05-01

    characteristics mentioned above, including the following: • Java Modeling Language (JML) implements DbC in Java. VDbC can then be carried out using tools like... Extended Static Checking (ESC/Java) [Cok 2005] or TACO [Galeotti 2010]. • Perfect Developer [Crocker 2003] is a specification and modeling language... These are written in the language employed in the environment (e.g., as Java Boolean expressions, if JML is used), which we call the carrier language
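
    JML expresses such contracts as Java annotations checked by tools like ESC/Java; as a language-neutral sketch of the same precondition/postcondition idea (not JML itself), consider plain assertions:

    ```python
    # Precondition/postcondition checking with plain assertions.
    import math

    def integer_sqrt(n):
        assert n >= 0, "precondition: n must be non-negative"
        r = math.isqrt(n)
        assert r * r <= n < (r + 1) * (r + 1), "postcondition: r = floor(sqrt(n))"
        return r

    print(integer_sqrt(10))   # -> 3
    ```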

  19. Rooting an Android Device

    DTIC Science & Technology

    2015-09-01

    Red Hat Enterprise Linux, version 6.5 • Android Development Tools (ADT), version 22.3.0-887826 • Saferoot1 • Samsung Galaxy S3 • Dell Precision T7400... method used for the Samsung Galaxy S3 is called Saferoot1, a well-known, open-source software. According to the Saferoot website, the process of... is applicable for the Samsung Galaxy S3 as well as many other Android devices, but there are several steps involved in rooting an Android device (as

  20. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zi-Kui; Gleeson, Brian; Shang, Shunli

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram) modeling, and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile, and environmentally protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad, for more rapid discovery and development of new materials.
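
    pycalphad is an open-source Python library, so the equilibrium step of such a CALPHAD workflow can be sketched in its documented style. The TDB file name, components, and conditions below are assumptions for illustration; a real Ni-base thermodynamic description would be substituted.

    ```python
    # Equilibrium calculation sketch in pycalphad's documented style.
    from pycalphad import Database, equilibrium, variables as v

    db = Database("ni_al_cr.tdb")              # hypothetical TDB description
    comps = ["NI", "AL", "CR", "VA"]           # VA = vacancies
    phases = list(db.phases.keys())
    conditions = {v.X("AL"): 0.10, v.X("CR"): 0.08,
                  v.T: 1273, v.P: 101325, v.N: 1}
    eq = equilibrium(db, comps, phases, conditions)
    print(eq.Phase.values.squeeze())           # stable phases at 1273 K
    ```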

  1. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  2. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  3. Women's Health Lotería: a new cervical cancer education tool for Hispanic females.

    PubMed

    Sheridan-Leos, N

    1995-05-01

    An innovative public education tool, called Women's Health Lotería (WHL), was created to promote cervical cancer awareness among Hispanic females. The tool covers the risk factors for cervical cancer, the American Cancer Society (ACS) cervical cancer screening guidelines, and the invasive cervical cancer incidence rate in the Hispanic population. Professional journals and books and ACS and National Cancer Institute literature served as sources. Scientific evidence strongly suggests that cervical cancer mortality decreases with regular Pap test screening for sexually active women or those who have reached age 18. Many Hispanic women, however, do not know about the importance of Pap testing. WHL was developed to meet this learning need. After attending the educational program, 87% of the respondents achieved the learning objectives. This educational program can be used to educate Hispanic women about cervical cancer. The content and principles also can be applied to other groups of women.

  4. Different Strokes for Different Folks: Visual Presentation Design between Disciplines

    PubMed Central

    Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.

    2015-01-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149
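
    The eigenslides analysis is PCA applied to slide images, in the spirit of eigenfaces; below is a minimal scikit-learn sketch, assuming the slides have already been rendered to equally sized grayscale arrays (the random data are placeholders for real slide pixels).

    ```python
    # PCA over flattened slide images ("eigenslides", akin to eigenfaces).
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    slides = rng.random((40, 64 * 48))      # 40 slides as 64x48 grayscale pixels

    pca = PCA(n_components=5)
    scores = pca.fit_transform(slides)      # each slide as 5 component scores
    eigenslides = pca.components_.reshape(5, 64, 48)
    print(pca.explained_variance_ratio_)
    ```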

  5. The PDA as a reference tool: libraries' role in enhancing nursing education.

    PubMed

    Scollin, Patrick; Callahan, John; Mehta, Apurva; Garcia, Elizabeth

    2006-01-01

    "The PDA as a Reference Tool: The Libraries' Role in Enhancing Nursing Education" is a pilot project funded by the University of Massachusetts President's Office Information Technology Council through their Professional Development Grant program in 2004. The project's goal is to offer faculty and students in nursing programs at two University of Massachusetts campuses access to an array of medical reference information, such as handbooks, dictionaries, calculators, and diagnostic tools, on small handheld computers called personal digital assistants. Through exposure to the variety of information resources in this digital format, participants can discover and explore these resources at no personal financial cost. Participants borrow handhelds from the University Library's circulation desks. The libraries provide support in routine resynchronizing of handhelds to update information. This report will discuss how the projects were administered, what we learned about what did and did not work, the problems and solutions, and where we hope to go from here.

  6. CRIE: An automated analyzer for Chinese texts.

    PubMed

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.
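
    CRIE's segmenter and parser are internal to the system; purely to illustrate what the segmentation step produces for Chinese text, the following uses the open-source jieba tokenizer (a different tool) on a sample sentence.

    ```python
    # Chinese word segmentation with the open-source jieba tokenizer.
    import jieba

    sentence = "自動化文本分析工具可以評估文章的可讀性"
    print("/".join(jieba.cut(sentence)))
    ```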

  7. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool for assessing soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool for working with data sets generated by the use of the CSSI technique to assess soil apportionment, called the CSSIAR v1.00 Software.
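
    SIAR itself fits a Bayesian mixing model; as a simplified illustration of the apportionment idea, the sketch below solves for non-negative source proportions whose mixed fatty-acid δ13C signature best matches a sediment sample, using non-negative least squares with a sum-to-one constraint row. All δ13C values are invented.

    ```python
    # Non-negative least-squares un-mixing of source proportions.
    import numpy as np
    from scipy.optimize import nnls

    # Rows: d13C of three fatty acids; columns: three candidate soil sources.
    sources = np.array([[-30.1, -27.4, -24.9],
                        [-31.0, -28.2, -25.5],
                        [-29.5, -26.8, -24.1]])
    mixture = np.array([-27.7, -28.5, -27.1])   # downstream sediment sample

    A = np.vstack([sources, np.ones(3)])        # extra row enforces sum ~ 1
    b = np.append(mixture, 1.0)
    proportions, residual = nnls(A, b)
    print("estimated source proportions:", proportions.round(2))
    ```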

  8. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    NASA Technical Reports Server (NTRS)

    Hunter, Paul

    2010-01-01

    Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, the Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  9. Anaconda: AN automated pipeline for somatic COpy Number variation Detection and Annotation from tumor exome sequencing data.

    PubMed

    Gao, Jianing; Wan, Changlin; Zhang, Huan; Li, Ao; Zang, Qiguang; Ban, Rongjun; Ali, Asim; Yu, Zhenghua; Shi, Qinghua; Jiang, Xiaohua; Zhang, Yuanwei

    2017-10-03

    Copy number variations (CNVs) are the main genetic structural variations in the cancer genome. Detecting CNVs in the genetic exome region is efficient and cost-effective in identifying cancer-associated genes. Many tools have been developed accordingly, and yet these tools lack reliability because of a high false-negative rate, which is intrinsically caused by exonic bias in the genome. To provide an alternative option, here we report Anaconda, a comprehensive pipeline that allows flexible integration of multiple CNV-calling methods and systematic annotation of CNVs in analyzing WES data. With just one command, Anaconda can generate CNV detection results from up to four CNV-detecting tools. Combined with comprehensive annotation analysis of genes involved in shared CNV regions, Anaconda is able to deliver a more reliable and useful report in assistance with CNV-associated cancer research. The Anaconda package and manual can be freely accessed at http://mcg.ustc.edu.cn/bsc/ANACONDA/ .
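
    The "shared CNV regions" step can be illustrated with a simple interval intersection. This is a sketch of the idea only, not Anaconda's implementation, and the interval data are invented.

    ```python
    # Intersect sorted CNV interval lists from two callers on one chromosome.
    def shared_regions(calls_a, calls_b):
        shared, i, j = [], 0, 0
        while i < len(calls_a) and j < len(calls_b):
            start = max(calls_a[i][0], calls_b[j][0])
            end = min(calls_a[i][1], calls_b[j][1])
            if start < end:
                shared.append((start, end))
            if calls_a[i][1] < calls_b[j][1]:   # advance the interval ending first
                i += 1
            else:
                j += 1
        return shared

    print(shared_regions([(100, 500), (900, 1500)], [(300, 1200)]))
    # -> [(300, 500), (900, 1200)]
    ```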

  10. Different Strokes for Different Folks: Visual Presentation Design between Disciplines.

    PubMed

    Gomez, S R; Jianu, R; Ziemkiewicz, C; Guo, Hua; Laidlaw, D H

    2012-12-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard "chalk talks". We found design differences in slideshows using two methods - coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant's own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information.

  11. Cause-and-effect mapping of critical events.

    PubMed

    Graves, Krisanne; Simmons, Debora; Galley, Mark D

    2010-06-01

    Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. By learning to identify and analyze errors, health care can develop some of the skills of a learning organization, including systems thinking. Experts in quality improvement have been working in other high-risk industries since the 1920s, making structured organizational changes through various frameworks for quality methods, including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within organizations. Within these frameworks of improvement, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called CauseMapping (ThinkReliability, Houston, TX, USA), which can be used to systematically analyze the process and the problem at the same time. Copyright 2010 Elsevier Inc. All rights reserved.

  12. The X-windows interactive navigation data editor

    NASA Technical Reports Server (NTRS)

    Rinker, G. C.

    1992-01-01

    A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.

  13. VOTable JAVA Streaming Writer and Applications.

    NASA Astrophysics Data System (ADS)

    Kulkarni, P.; Kembhavi, A.; Kale, S.

    2004-07-01

    Virtual Observatory related tools use a new standard for data transfer called the VOTable format. This is a variant of the XML format that enables easy transfer of data over the web. We describe a streaming interface that can bridge the VOTable format, through a user-friendly graphical interface, with the FITS and ASCII formats commonly used by astronomers. A streaming interface is important for efficient use of memory because of the large size of catalogues. The tools are developed in JAVA to provide a platform-independent interface. We have also developed a stand-alone version that can be used to convert data stored in ASCII or FITS format on a local machine. The streaming writer is successfully being used in VOPlot (see Kale et al. 2004 for a description of VOPlot). We present the test results of converting large FITS and ASCII data into the VOTable format on machines that have only limited memory.
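
    For a sense of the conversion these tools perform, here is a non-streaming Python sketch using astropy (the JAVA streaming writer described above is a separate implementation; the file names are placeholders):

      from astropy.table import Table
      from astropy.io.votable import from_table, writeto

      # Read an ASCII (or FITS) catalogue and write it back out as a VOTable.
      # For catalogues too large for memory, a streaming writer like the one
      # described above processes rows incrementally instead of loading the
      # whole table at once, as this sketch does.
      table = Table.read("catalogue.txt", format="ascii")   # or format="fits"
      votable = from_table(table)
      writeto(votable, "catalogue.vot")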

  14. Bioinformatics and molecular modeling in glycobiology

    PubMed Central

    Schloissnig, Siegfried

    2010-01-01

    The field of glycobiology is concerned with the study of the structure, properties, and biological functions of the family of biomolecules called carbohydrates. Bioinformatics for glycobiology is a particularly challenging field, because carbohydrates exhibit a high structural diversity and their chains are often branched. Significant improvements in experimental analytical methods over recent years have led to a tremendous increase in the amount of carbohydrate structure data generated. Consequently, the availability of databases and tools to store, retrieve and analyze these data in an efficient way is of fundamental importance to progress in glycobiology. In this review, the various graphical representations and sequence formats of carbohydrates are introduced, and an overview of newly developed databases, the latest developments in sequence alignment and data mining, and tools to support experimental glycan analysis are presented. Finally, the field of structural glycoinformatics and molecular modeling of carbohydrates, glycoproteins, and protein–carbohydrate interaction are reviewed. PMID:20364395

  15. GPS as a tool used in tourism as illustrated by selected mobile applications

    NASA Astrophysics Data System (ADS)

    Szark-Eckardt, Mirosława

    2017-11-01

    Mobile technologies have permanently changed our way of life. Their availability, common use, and introduction into virtually all areas of human activity mean that we can call present times the age of mobility [1]. Mobile applications based on the GPS module are among the most dynamically developing apps, as particularly reflected in tourism. A multitude of applications dedicated to different participants in tourism, which can be operated by means of smartphones or simple GPS trackers, are encouraging more people to reach for this kind of technology, perceiving it as a basic tool in today's tourism. Due to increasingly wide access to mobile applications, not only can a more dynamic development of tourism itself be noticed, but also the growth of healthy behaviours that comprise a positive "side effect" of tourism based on mobile technology. This article demonstrates a correlation between the health and physical condition of the population and the use of mobile applications.

  16. Characterization of Protein Flexibility Using Small-Angle X-Ray Scattering and Amplified Collective Motion Simulations

    PubMed Central

    Wen, Bin; Peng, Junhui; Zuo, Xiaobing; Gong, Qingguo; Zhang, Zhiyong

    2014-01-01

    Large-scale flexibility within a multidomain protein often plays an important role in its biological function. Despite its inherent low resolution, small-angle x-ray scattering (SAXS) is well suited to investigate protein flexibility and determine, with the help of computational modeling, what kinds of protein conformations would coexist in solution. In this article, we develop a tool that combines SAXS data with a previously developed sampling technique called amplified collective motions (ACM) to elucidate structures of highly dynamic multidomain proteins in solution. We demonstrate the use of this tool in two proteins, bacteriophage T4 lysozyme and tandem WW domains of the formin-binding protein 21. The ACM simulations can sample the conformational space of proteins much more extensively than standard molecular dynamics (MD) simulations. Therefore, conformations generated by ACM are significantly better at reproducing the SAXS data than are those from MD simulations. PMID:25140431
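
    A common way to compare sampled conformations against SAXS data is a reduced chi-square between computed and experimental profiles with a fitted scale factor; the abstract does not specify the exact score used with ACM, so the following Python sketch is a generic illustration, assuming both profiles I(q) are given on a common q grid:

      import numpy as np

      def saxs_chi2(I_calc, I_exp, sigma):
          """Reduced chi-square between a computed SAXS profile and
          experiment, with the optimal least-squares scale factor c
          applied to the computed profile."""
          w = 1.0 / sigma**2
          c = np.sum(w * I_calc * I_exp) / np.sum(w * I_calc**2)  # best scale
          return np.mean(((c * I_calc - I_exp) / sigma) ** 2)

      # Ranking an ACM (or MD) ensemble by agreement with experiment:
      # chi2 = [saxs_chi2(profile_of(conf), I_exp, sigma) for conf in ensemble]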

  17. Variable context Markov chains for HIV protease cleavage site prediction.

    PubMed

    Oğul, Hasan

    2009-06-01

    Deciphering the specificity of HIV protease and developing computational tools for detecting its cleavage sites in a protein polypeptide chain are highly desirable for designing efficient and specific chemical inhibitors to prevent acquired immunodeficiency syndrome. In this study, we developed a generative model based on a generalization of variable order Markov chains (VOMC) for peptide sequences and adapted the model for prediction of their cleavability by certain proteases. The new method, called variable context Markov chains (VCMC), attempts to identify context equivalence based on the evolutionary similarities between individual amino acids. It was applied to the HIV-1 protease cleavage site prediction problem and shown to outperform existing methods in terms of prediction accuracy on a common dataset. In general, the method is a promising tool for predicting the cleavage sites of any protease and can also be applied to other kinds of peptide classification problems.
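
    To make the underlying modeling idea concrete, here is a sketch of an ordinary variable-order Markov chain scorer for peptide windows; VCMC's contribution, context equivalence via amino-acid similarity, is not reproduced here, and the toy training windows are invented:

      import math
      from collections import defaultdict

      class VOMC:
          """Variable-order Markov chain: back off to shorter contexts
          until one has been observed in training (sketch)."""
          def __init__(self, max_order=3):
              self.max_order = max_order
              self.counts = defaultdict(lambda: defaultdict(int))

          def train(self, sequences):
              for seq in sequences:
                  for i in range(len(seq)):
                      for k in range(self.max_order + 1):
                          if i - k >= 0:
                              self.counts[seq[i - k:i]][seq[i]] += 1

          def prob(self, context, aa):
              for k in range(min(self.max_order, len(context)), -1, -1):
                  ctx = context[len(context) - k:]
                  if ctx in self.counts:
                      c = self.counts[ctx]
                      return (c[aa] + 1) / (sum(c.values()) + 20)  # Laplace, 20 AAs
              return 1.0 / 20

          def log_prob(self, window):
              return sum(math.log(self.prob(window[:i], window[i]))
                         for i in range(len(window)))

      # Train one model on cleaved octamers and one on uncleaved octamers,
      # then classify a candidate window by comparing log-probabilities.
      cleaved, uncleaved = VOMC(), VOMC()
      cleaved.train(["SQNYPIVQ", "ARVLAEAM"])    # toy training windows
      uncleaved.train(["GQTLSDQD", "KARVLAEA"])
      w = "SQNYPIVQ"
      print(cleaved.log_prob(w) > uncleaved.log_prob(w))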

  18. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX, has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design, as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  19. Epsilon-Q: An Automated Analyzer Interface for Mass Spectral Library Search and Label-Free Protein Quantification.

    PubMed

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki

    2017-12-01

    Mass spectrometry (MS) is a widely used proteome analysis tool in biomedical science. In MS-based bottom-up proteomic approaches to protein identification, sequence database (DB) searching has been used routinely because of its simplicity and convenience. However, searching a sequence DB with multiple variable-modification options can increase processing time and false-positive errors in large and complicated MS data sets. Spectral library searching is an alternative solution that avoids the limitations of sequence DB searching and allows the detection of more peptides with high sensitivity. Unfortunately, this technique has lower proteome coverage, limiting the detection of novel and complete peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which manually combines multiple reference and simulated spectral libraries to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery-rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins, supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, the resulting Epsilon-Q system shows a synergistic effect, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
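
    Class-specific FDR control of the kind mentioned above is commonly implemented with target-decoy estimation applied separately per class of matches. A schematic Python sketch (the field names and classes are assumptions for illustration, not Epsilon-Q's internal format):

      # psms = [{"score": 5.2, "decoy": False, "class": "reference"}, ...]

      def class_fdr_filter(psms, fdr=0.01):
          """Filter spectrum matches at a given FDR separately per class
          (e.g. reference-library vs simulated-library hits), using the
          target-decoy estimate FDR ~ #decoys / #targets above a cutoff."""
          kept = []
          for cls in {p["class"] for p in psms}:
              hits = sorted((p for p in psms if p["class"] == cls),
                            key=lambda p: p["score"], reverse=True)
              targets = decoys = 0
              for hit in hits:
                  if hit["decoy"]:
                      decoys += 1
                  else:
                      targets += 1
                  if targets and decoys / targets > fdr:
                      break                      # score cutoff reached
                  if not hit["decoy"]:
                      kept.append(hit)
          return kept

    Controlling each class at its own threshold avoids a sensitive class (e.g. reference-library hits) absorbing the error budget of a noisier one.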

  20. PREPARE: innovative integrated tools and platforms for radiological emergency preparedness and post-accident response in Europe.

    PubMed

    Raskob, Wolfgang; Schneider, Thierry; Gering, Florian; Charron, Sylvie; Zhelezniak, Mark; Andronopoulos, Spyros; Heriard-Dubreuil, Gilles; Camps, Johan

    2015-04-01

    The PREPARE project, which started in February 2013 and will end at the beginning of 2016, aims to close gaps that have been identified in nuclear and radiological preparedness in Europe following the first evaluation of the Fukushima disaster. Among other goals, the project will review existing operational procedures for dealing with long-lasting releases and cross-border problems in radiation monitoring and food safety, and will further develop missing functionalities in decision support systems (DSS), ranging from improved source-term estimation and dispersion modelling to the inclusion of hydrological pathways for European water bodies. In addition, a so-called Analytical Platform will be developed, exploring the scientific and operational means to improve information collection, information exchange, and the evaluation of such disasters. The tools developed within the project will be partly integrated into the two DSS ARGOS and RODOS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. SmaggIce 2D Version 1.8: Software Toolkit Developed for Aerodynamic Simulation Over Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Vickerman, Mary B.

    2005-01-01

    SmaggIce 2D version 1.8 is a software toolkit developed at the NASA Glenn Research Center that consists of tools for modeling the geometry of, and generating the grids for, clean and iced airfoils. Plans call for the completed SmaggIce 2D version 2.0 to streamline the entire aerodynamic simulation process (the characterization and modeling of ice shapes, grid generation, and flow simulation) and to be closely coupled with the public-domain application flow solver, WIND. Grids generated using version 1.8, however, can be used by other flow solvers. SmaggIce 2D will help researchers and engineers study the effects of ice accretion on airfoil performance, which is difficult to do with existing software tools because of complex ice shapes. Using SmaggIce 2D, when fully developed, to simulate flow over an iced airfoil will help reduce the cost of performing flight and wind-tunnel tests for certifying aircraft in natural and simulated icing conditions.

  2. The Development of an eHealth Tool Suite for Prostate Cancer Patients and Their Partners

    PubMed Central

    Van Bogaert, Donna; Hawkins, Robert; Pingree, Suzanne; Jarrard, David

    2013-01-01

    Background eHealth resources for people facing health crises must balance the expert knowledge and perspective of developers and clinicians against the very different needs and perspectives of prospective users. This formative study explores the information and support needs of posttreatment prostate cancer patients and their partners as a way to improve an existing eHealth information and support system called CHESS (Comprehensive Health Enhancement Support System). Methods Focus groups with patient survivors and their partners were used to identify information gaps and information-seeking milestones. Results Both patients and partners expressed a need for assistance in decision making, connecting with experienced patients, and making sexual adjustments. Female partners of patients are more active in searching for cancer information. All partners have information and support needs distinct from those of the patient. Conclusions Findings were used to develop a series of interactive tools and navigational features for the CHESS prostate cancer computer-mediated system. PMID:22591675

  3. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
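
    As a flavor of the approach (the F6 tool's internals are not described in this abstract; the task names, triangular durations, and dependency structure below are invented), a minimal Monte Carlo sketch of a development schedule with uncertain task durations:

      import random

      def simulate_schedule(tasks, deps, runs=2000):
          """tasks: name -> (low, mode, high) duration in months (triangular);
          deps: name -> list of prerequisites. `tasks` is assumed to be
          listed in a valid topological order. Returns mean completion time."""
          totals = []
          for _ in range(runs):
              finish = {}
              for name, (lo, mode, hi) in tasks.items():
                  start = max((finish[d] for d in deps.get(name, [])), default=0.0)
                  finish[name] = start + random.triangular(lo, hi, mode)
              totals.append(max(finish.values()))
          return sum(totals) / runs

      # Hypothetical fractionated-spacecraft modules plus an integration task:
      tasks = {"bus": (8, 10, 14), "payload": (6, 9, 15), "integrate": (2, 3, 5)}
      deps = {"integrate": ["bus", "payload"]}
      print(round(simulate_schedule(tasks, deps), 1), "months on average")

    Distributions over completion time (rather than a single point estimate) are what allow delay and failure events to be traded against delivered value.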

  4. Stitching Techniques Advance Optics Manufacturing

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Because NASA depends on the fabrication and testing of large, high-quality aspheric (nonspherical) optics for applications like the James Webb Space Telescope, it sought an improved method for measuring large aspheres. Through Small Business Innovation Research (SBIR) awards from Goddard Space Flight Center, QED Technologies, of Rochester, New York, upgraded and enhanced its stitching technology for aspheres. QED developed the SSI-A, which earned the company an R&D 100 award, and also developed a breakthrough machine tool called the aspheric stitching interferometer. The equipment is applied to advanced optics in telescopes, microscopes, cameras, medical scopes, binoculars, and photolithography.

  5. Efficient Bulk Data Replication for the Earth System Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sim, Alex; Gunter, Dan; Natarajan, Vijaya

    2010-03-10

    The Earth System Grid (ESG) community faces the difficult challenge of managing the distribution of massive data sets to thousands of scientists around the world. To move data replicas efficiently, the ESG has developed a data transfer management tool called the Bulk Data Mover (BDM). We describe the performance results of the current system and plans towards extending the techniques developed so far for the upcoming project, in which the ESG will employ advanced networks to move multi-TB datasets with the ultimate goal of helping researchers understand climate change and its potential impacts on world ecology and society.

  6. CCDST: A free Canadian climate data scraping tool

    NASA Astrophysics Data System (ADS)

    Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.

    2015-02-01

    In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database where the data are contained on multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access and simplify analysis of climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs a URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. As a demonstration, we present a case study of the temporal dynamics of blowing snow events in which the tool yielded ~3.1 weeks of time savings. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
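
    The download loop the CCDST automates looks roughly like this in Python (the tool itself is an Excel macro; the URL template and parameter names below are placeholders, not the NCDIA's actual interface):

      import requests

      # Hypothetical URL template with station and date parameters
      URL = "https://example.org/climate?stationID={sid}&Year={y}&Month={m}&format=csv"

      def fetch_station(sid, years, out_path, header_rows=1):
          """Fetch monthly CSVs for one station, strip per-file headers,
          and merge everything into a single output file."""
          with open(out_path, "w") as out:
              first = True
              for y in years:
                  for m in range(1, 13):
                      resp = requests.get(URL.format(sid=sid, y=y, m=m), timeout=60)
                      resp.raise_for_status()
                      lines = resp.text.splitlines(keepends=True)
                      # keep the column header only once, from the first file
                      out.writelines(lines if first else lines[header_rows:])
                      first = False

      # fetch_station("2205", range(1990, 2016), "station_2205.csv")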

  7. Conceptual frameworks in astronomy

    NASA Astrophysics Data System (ADS)

    Pundak, David

    2016-06-01

    How to evaluate students' astronomy understanding is still an open question. Even though some methods and tools to help students have already been developed, the sources of students' difficulties and misunderstandings in astronomy are still unclear. This paper presents an investigation of the development of conceptual systems in astronomy among 50 engineering students as a result of taking a general course on astronomy. A special tool called Conceptual Frameworks in Astronomy (CFA), first used in 1989, was adapted to gather data for the present research. In its new version, the tool included 23 questions, with five or six optional answers for each question. Each answer was characterized by one of four conceptual astronomical frameworks: pre-scientific, geocentric, heliocentric, and sidereal or scientific. The paper describes the development of the tool and discusses its validity and reliability. Using the CFA, we were able to identify the conceptual frameworks of the students at the beginning of the course and at its end. The CFA enabled us to evaluate the paradigmatic change of students following the course and also the extent of the general improvement in astronomical knowledge. It was found that the measure of the students' improvement (gain index) was g = 0.37. Approximately 45% of the students in the course improved their understanding of conceptual frameworks in astronomy, and 26% deepened their understanding of the heliocentric or sidereal conceptual frameworks.
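
    The g notation suggests the gain index is the normalized gain commonly used in education research (an assumption; the paper's exact definition may differ), i.e. the fraction of the possible improvement actually achieved:

      g = \frac{\langle\mathrm{post}\rangle - \langle\mathrm{pre}\rangle}{100\% - \langle\mathrm{pre}\rangle}

    Under that reading, g = 0.37 means a class averaging 40% on the pre-test would be expected to reach about 40% + 0.37 x 60% = 62% on the post-test.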

  8. Investigating the Effectiveness of Computer-Assisted Language Learning (CALL) Using Google Documents in Enhancing Writing--A Study on Senior 1 Students in a Chinese Independent High School

    ERIC Educational Resources Information Center

    Ambrose, Regina Maria; Palpanathan, Shanthini

    2017-01-01

    Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…

  9. Creativity and Collaboration: Using CALL to Facilitate International Collaboration for Online Journalism at a Model United Nations Event

    ERIC Educational Resources Information Center

    Sheehan, Mark D.; Thorpe, Todd; Dunn, Robert

    2015-01-01

    Much has been gained over the years in various educational fields that have taken advantage of CALL. In many cases, CALL has facilitated learning and provided teachers and students access to materials and tools that would have remained out of reach were it not for technology. Nonetheless, there are still cases where a lack of funding or access to…

  10. New tools for linking human and earth system models: The Toolbox for Human-Earth System Interaction & Scaling (THESIS)

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.; Kauffman, B.; Lawrence, P.

    2016-12-01

    Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined to carry out more complex analyses. We illustrate their application both to the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves using THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
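
    The exposure calculation mentioned above reduces, at its core, to an overlay of co-registered grids. A schematic numpy sketch (the grids, units, and values are invented for illustration and are not THESIS output):

      import numpy as np

      def heat_exposure(population, hot_days):
          """Person-days of exposure: co-registered grids of population
          counts and annual counts of extreme-heat days, summed cell by
          cell."""
          return float(np.sum(population * hot_days))

      # Toy 2x2 grids: population counts and days/year above a heat threshold
      pop  = np.array([[1000.,  500.], [0., 2000.]])
      days = np.array([[  12.,    3.], [20.,    8.]])
      print(heat_exposure(pop, days), "person-days per year")  # 29500.0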

  11. R&D Plan for RISMC Industry Application #1: ECCS/LOCA Cladding Acceptance Criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques; Zhang, Hongbin; Epiney, Aaron Simon

    The Nuclear Regulatory Commission (NRC) is finalizing a rulemaking change that would revise the requirements in 10 CFR 50.46. In the proposed new rulemaking, designated as 10 CFR 50.46c, the NRC proposes a fuel performance-based equivalent cladding reacted (ECR) criterion as a function of cladding hydrogen content before the accident (pre-transient) in order to include the effects of higher burnup on cladding performance as well as to address other technical issues. A loss of operational margin may result due to the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee costs, as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. The Idaho National Laboratory (INL) has initiated a project, as part of the DOE Light Water Reactor Sustainability Program (LWRS), to develop analytical capabilities to support the industry in the transition to the new rule. This project is called Industry Application 1 (IA1) within the Risk-Informed Safety Margin Characterization (RISMC) Pathway of LWRS. The general idea behind the initiative is the development of an Integrated Evaluation Model (IEM). The motivation is to develop a multiphysics framework to analyze how uncertainties are propagated across the stream of physical disciplines and data involved, as well as how risks are evaluated in a LOCA safety analysis as regulated under 10 CFR 50.46c. This IEM is called LOTUS, which stands for LOCA Toolkit for US, and it represents the LWRS Program's response to the proposed new rulemaking. The focus of this report is to complete an R&D plan describing the demonstration of the LOCA/ECCS RISMC Industry Application #1 using the advanced RISMC Toolkit and methodologies. This report includes the description and development plan for a RISMC LOCA tool that fully couples advanced MOOSE tools already in development in order to characterize and optimize plant safety and operational margins. The advanced MOOSE tools needed to complete this integrated evaluation model are RAVEN, RELAP-7, BISON, and MAMMOTH.

  12. EPA Registers Innovative Tool to Control Corn Rootworm

    EPA Pesticide Factsheets

    Ribonucleic acid interference (RNAi) based Plant Incorporated Protectant (PIP) technology is a new and innovative scientific tool utilized by U.S. growers. Learn more about RNAi technology and the 4 new products containing the RNAi based PIP called SMARTST

  13. Concepts, tools, and strategies for effluent testing: An international survey

    EPA Science Inventory

    Whole effluent testing (also called Direct Toxicity Assessment) remains a critical long-term assessment tool for aquatic environmental protection. Use of animal alternative approaches for wastewater testing is expected to increase as more regulatory authorities routinely require ...

  14. How Can I Deal with My Asthma?

    MedlinePlus

    ... had trouble with it and why. Use asthma management tools. Even if you're feeling absolutely fine, don't abandon tools like daily long-term control medicines (also called "controller" or "maintenance" medicines) if they're a part of your ...

  15. Reusable science tools for analog exploration missions: xGDS Web Tools, VERVE, and Gigapan Voyage

    NASA Astrophysics Data System (ADS)

    Lee, Susan Y.; Lees, David; Cohen, Tamar; Allan, Mark; Deans, Matthew; Morse, Theodore; Park, Eric; Smith, Trey

    2013-10-01

    The Exploration Ground Data Systems (xGDS) project, led by the Intelligent Robotics Group (IRG) at NASA Ames Research Center, creates software tools to support multiple NASA-led planetary analog field experiments. The two primary tools that fall under the xGDS umbrella are the xGDS Web Tools (xGDS-WT) and the Visual Environment for Remote Virtual Exploration (VERVE). IRG has also developed Gigapan Voyage, a hardware and software system that is closely integrated with our xGDS tools and is used in multiple field experiments. xGDS-WT, VERVE, and Gigapan Voyage are examples of IRG projects that improve the ratio of science return to development effort by creating generic and reusable tools that leverage existing technologies in both hardware and software. xGDS Web Tools provides software for gathering and organizing mission data for science and engineering operations, including tools for planning traverses, monitoring autonomous or piloted vehicles, visualization, documentation, analysis, and search. VERVE provides high-performance three-dimensional (3D) user interfaces used by scientists, robot operators, and mission planners to visualize robot data in real time. Gigapan Voyage is a gigapixel image capturing and processing tool that improves situational awareness and scientific exploration in human and robotic analog missions. All of these technologies emphasize software reuse and leverage open-source and/or commercial-off-the-shelf tools to greatly improve the utility and reduce the development and operational cost of future similar technologies. Over the past several years these technologies have been used in many NASA-led robotic field campaigns, including the Desert Research and Technology Studies (DRATS), the Pavilion Lake Research Project (PLRP), and the K10 Robotic Follow-Up tests, and most recently we have become involved in the NASA Extreme Environment Mission Operations (NEEMO) field experiments. A major objective of these joint robot and crew experiments is to improve NASA's understanding of how to most effectively execute exploration missions and increase their science return. This paper focuses on an integrated suite of xGDS software and compatible hardware tools (xGDS Web Tools, VERVE, and Gigapan Voyage), how they are used, and the design decisions that were made to allow them to be easily developed, integrated, tested, and reused by multiple NASA field experiments and robotic platforms.

  16. A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lim, Chieng-Fai

    1991-01-01

    The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools, such as the SYLON synthesis system (X90), (CM89), (LM90), have been developed based on this method. A parallel implementation of SYLON-XTRANS (XM89) on an eight-processor Encore Multimax shared-memory multiprocessor is presented. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break large circuits up and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.
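
    The inter-partition dynamic load balancing described above follows a familiar pattern: circuit partitions go into a shared pool and idle workers pull the next one. A generic Python sketch of that pattern only (not the PTRANS code, which targeted a shared-memory Encore Multimax; the per-partition optimization step is a stub):

      from multiprocessing import Pool

      def optimize_partition(gates):
          """Stub for per-partition minimization (pruning, gate
          substitution, merging, and input reduction in the real tool)."""
          return sorted(gates)  # placeholder transformation

      def optimize_network(partitions, workers=8):
          # map with chunksize=1 hands each partition to the next idle
          # worker, i.e. dynamic load balancing across unequal partitions.
          with Pool(workers) as pool:
              return pool.map(optimize_partition, partitions, chunksize=1)

      if __name__ == "__main__":
          parts = [["g3", "g1"], ["g7", "g2", "g5"], ["g4"]]
          print(optimize_network(parts, workers=3))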

  17. Digital teaching tools and global learning communities.

    PubMed

    Williams, Mary; Lockhart, Patti; Martin, Cathie

    2015-01-01

    In 2009, we started a project to support the teaching and learning of university-level plant sciences, called Teaching Tools in Plant Biology. Articles in this series are published by the plant science journal, The Plant Cell (published by the American Society of Plant Biologists). Five years on, we investigated how the published materials are being used through an analysis of the Google Analytics pageviews distribution and through a user survey. Our results suggest that this project has had a broad, global impact in supporting higher education, and also that the materials are used differently by individuals in terms of their role (instructor, independent learner, student) and geographical location. We also report on our ongoing efforts to develop a global learning community that encourages discussion and resource sharing.

  18. cisprimertool: software to implement a comparative genomics strategy for the development of conserved intron scanning (CIS) markers.

    PubMed

    Jayashree, B; Jagadeesh, V T; Hoisington, D

    2008-05-01

    The availability of complete, annotated genomic sequence information in model organisms is a rich resource that can be extended to understudied orphan crops through comparative genomic approaches. We report here a software tool (cisprimertool) for the identification of conserved intron scanning regions using expressed sequence tag alignments to a completely sequenced model crop genome. The method is based on earlier studies assessing conserved intron scanning primers (called CISP) within relatively conserved exons located near exon-intron boundaries, derived from alignments of onion, banana, sorghum and pearl millet with rice. The tool is freely available to academic users at http://www.icrisat.org/gt-bt/CISPTool.htm. © 2007 ICRISAT.
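
    Schematically, CISP design places primers in conserved exon sequence flanking an intron so the amplicon spans the variable intron. A toy Python sketch of that idea (the coordinates, identity threshold, and conservation test are simplified assumptions, not cisprimertool's algorithm):

      def identity(a, b):
          """Fraction of identical positions in two equal-length sequences."""
          return sum(x == y for x, y in zip(a, b)) / len(a)

      def cisp_primer_sites(model_up, est_up, model_down, est_down,
                            primer_len=20, min_identity=0.9):
          """Given aligned exon sequence upstream/downstream of an intron
          in the model genome (e.g. rice) and in the orphan-crop EST,
          return primer windows flanking the intron if both windows are
          conserved enough between the two species."""
          fwd = model_up[-primer_len:]     # 3' end of the upstream exon
          rev = model_down[:primer_len]    # 5' start of the downstream exon
          if (identity(fwd, est_up[-primer_len:]) >= min_identity and
                  identity(rev, est_down[:primer_len]) >= min_identity):
              return fwd, rev              # the amplicon spans the intron
          return None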

  19. The Culture Audit: A Leadership Tool for Assessment and Strategic Planning in Diverse Schools and Colleges

    ERIC Educational Resources Information Center

    Bustamante, Rebecca M.

    2006-01-01

    This module is designed to introduce educational leaders to an organizational assessment tool called a "culture audit." Literature on organizational cultural competence suggests that culture audits are a valuable tool for determining how well school policies, programs, and practices respond to the needs of diverse groups and prepare…

  20. HYPATIA--An Online Tool for ATLAS Event Visualization

    ERIC Educational Resources Information Center

    Kourkoumelis, C.; Vourakis, S.

    2014-01-01

    This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.
