Sample records for quick computational tool

  1. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
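
    For readers who want to experiment with the baseline technique, the sketch below warps 2D coordinates with a thin plate spline using SciPy's radial basis function interpolator. It illustrates the TPS warping that LittleQuickWarp is benchmarked against, not the plugin itself; the control points are invented.

    ```python
    # 2D thin-plate-spline (TPS) warping with SciPy's RBF interpolator -- the
    # baseline LittleQuickWarp is compared against, not the Vaa3D plugin itself.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Hypothetical landmarks: positions in the standard space (dst) and the
    # matching positions in the subject image (src).
    src = np.array([[0, 0], [0, 100], [100, 0], [100, 100], [50, 50]], float)
    dst = np.array([[5, 2], [-3, 98], [103, 4], [97, 102], [55, 48]], float)

    # Fit the TPS mapping from target space back to source space (backward
    # warping, so every output pixel gets a value).
    tps = RBFInterpolator(dst, src, kernel="thin_plate_spline")

    # For each pixel of the output grid, find where it samples from.
    ys, xs = np.mgrid[0:101, 0:101]
    out_coords = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    src_coords = tps(out_coords)   # (N, 2) source positions to interpolate at
    print(src_coords[:3])
    ```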

  2. Ten quick tips for machine learning in computational biology.

    PubMed

    Chicco, Davide

    2017-01-01

    Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and therefore can follow incorrect practices that may lead to common mistakes or over-optimistic results. With this review, we present ten quick tips to take advantage of machine learning in any computational biology context, by avoiding some common errors that we observed hundreds of times in multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner to carry out a successful project in computational biology and related sciences.
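
    One recurring theme in such tip collections is avoiding over-optimistic results caused by data leakage. A minimal scikit-learn sketch, assuming a generic feature matrix and binary labels, keeps preprocessing inside the cross-validation loop so each test fold stays untouched during training:

    ```python
    # Sketch of one common tip: fit preprocessing per fold so test folds never
    # leak into training. Assumes scikit-learn; the data here are random toys.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 25))      # toy expression-like features
    y = rng.integers(0, 2, size=200)    # toy binary phenotype

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    scores = cross_val_score(model, X, y, cv=10)   # scaler refit on each fold
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```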

  3. Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs with compiler directives has improved substantially. The introduction of OpenMP directives, the industrial standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and also to achieve good performance that exceeds that of some commercial tools.

  4. PLASMAP: an interactive computational tool for storage, retrieval and device-independent graphic display of conventional restriction maps.

    PubMed Central

    Stone, B N; Griesinger, G L; Modelevsky, J L

    1984-01-01

    We describe an interactive computational tool, PLASMAP, which allows the user to electronically store, retrieve, and display circular restriction maps. PLASMAP permits users to construct libraries of plasmid restriction maps as a set of files which may be edited in the laboratory at any time. The display feature of PLASMAP quickly generates device-independent, artist-quality, full-color or monochrome, hard copies or CRT screens of complex, conventional circular restriction maps. PMID:6320096

  5. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  6. Going virtual with QuickTime VR: new methods and standardized tools for interactive dynamic visualization of anatomical structures.

    PubMed

    Trelease, R B; Nieder, G L; Dørup, J; Hansen, M S

    2000-04-15

    Continuing evolution of computer-based multimedia technologies has produced QuickTime, a multiplatform digital media standard that is supported by stand-alone commercial programs and World Wide Web browsers. While its core functions might be most commonly employed for production and delivery of conventional video programs (e.g., lecture videos), additional QuickTime VR "virtual reality" features can be used to produce photorealistic, interactive "non-linear movies" of anatomical structures ranging in size from microscopic through gross anatomic. But what is really included in QuickTime VR and how can it be easily used to produce novel and innovative visualizations for education and research? This tutorial introduces the QuickTime multimedia environment, its QuickTime VR extensions, basic linear and non-linear digital video technologies, image acquisition, and other specialized QuickTime VR production methods. Four separate practical applications are presented for light and electron microscopy, dissectable preserved specimens, and explorable functional anatomy in magnetic resonance cinegrams.

  7. Hybrid and Electric Advanced Vehicle Systems Simulation

    NASA Technical Reports Server (NTRS)

    Beach, R. F.; Hammond, R. A.; Mcgehee, R. K.

    1985-01-01

    Predefined components are connected to represent a wide variety of propulsion systems. The Hybrid and Electric Advanced Vehicle System (HEAVY) computer program is a flexible tool for evaluating the performance and cost of electric and hybrid vehicle propulsion systems. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train.

  8. Computer Aided Self-Forging Fragment Design,

    DTIC Science & Technology

    1978-06-01

    This value is reached so quickly that HEMP solutions using work hardening and those using only elastic, perfectly plastic formulations are quite ... At Lawrence Livermore Laboratory, the HEMP computer code has been developed to serve as an effective design tool to simplify this task considerably. Using this code, warheads ... Reference fragments recovered from the scan: Elastic-Plastic Flow, UCRL-7322, Lawrence Radiation Laboratory, Livermore, California (1969); Giroux, E. D., HEMP Users Manual, UCRL-51079.

  9. Google Scholar and the Continuing Education Literature

    ERIC Educational Resources Information Center

    Howland, Jared L.; Howell, Scott; Wright, Thomas C.; Dickson, Cody

    2009-01-01

    The recent introduction of Google Scholar has renewed hope that someday a powerful research tool will bring continuing education literature more quickly, freely, and completely to one's computer. The authors suggest that using Google Scholar with other traditional search methods will narrow the research gap between what is discoverable and…

  10. Digitools: Hi-Tech for the Digital Generation

    ERIC Educational Resources Information Center

    Carver, Diane

    2012-01-01

    Getting students to learn new technology skills can be a challenge. After all, they know everything about computers and the Internet, right? Ninth grade students in Mrs. Aszklar's Digital Media Tools ("Digitools") class at Spanaway Junior High in Washington state quickly discovered they might not know as much as they thought. From…

  11. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  12. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables in an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases in less time than is required to compute the SD matrix using centered divided differences.
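
    ADIFOR itself works by FORTRAN source transformation, but the chain-rule bookkeeping it mechanizes can be illustrated with a tiny forward-mode differentiator in Python built on dual numbers. This is an illustrative analogue, not ADIFOR's method:

    ```python
    # Tiny forward-mode automatic differentiation via dual numbers, to show
    # the chain-rule propagation that AD tools mechanize.
    import math

    class Dual:
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot              # value and derivative
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.dot * o.val + self.val * o.dot)  # product rule
        __rmul__ = __mul__

    def sin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

    # d/dx [x * sin(x) + 3x] at x = 2.0, seeded with dx/dx = 1
    x = Dual(2.0, 1.0)
    f = x * sin(x) + 3 * x
    print(f.val, f.dot)   # exact derivative: sin(2) + 2*cos(2) + 3
    ```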

  13. p3d--Python module for structural bioinformatics.

    PubMed

    Fufezan, Christian; Specht, Michael

    2009-08-21

    High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this the Python scripting language is the optimal choice since its philosophy is to write understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that allow combining (a) and (b) using human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is the perfect tool to quickly develop tools for structural bioinformatics using the Python scripting language.
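
    The abstract credits p3d's speed to its BSP tree. As an illustrative analogue (generic SciPy, not p3d's actual query API), a k-d tree over atom coordinates answers the same kind of "atoms within a radius" question, and plain Python sets supply the set-theory part:

    ```python
    # Illustrative analogue of p3d's fast spatial lookups: a k-d tree (a close
    # cousin of a BSP tree) over fake atom coordinates. Generic SciPy, not p3d.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    atoms = rng.uniform(0, 50, size=(5000, 3))   # fake PDB atom coordinates (A)

    tree = cKDTree(atoms)
    center = np.array([25.0, 25.0, 25.0])

    # "All atoms within 6 A of a point" -- the kind of query a contact or
    # binding-site analysis issues thousands of times.
    idx = tree.query_ball_point(center, r=6.0)
    print(len(idx), "atoms within 6 A")

    # Set theory on query results: atoms near either of two sites.
    idx2 = tree.query_ball_point(np.array([10.0, 10.0, 10.0]), r=6.0)
    near_either = set(idx) | set(idx2)
    ```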

  14. A Pythonic Approach for Computational Geosciences and Geo-Data Processing

    NASA Astrophysics Data System (ADS)

    Morra, G.; Yuen, D. A.; Lee, S. M.

    2016-12-01

    Computational methods and data analysis play a constantly increasing role in the Earth Sciences; however, students and professionals need to climb a steep learning curve before reaching a level that allows them to run effective models. Furthermore, the recent arrival of new powerful machine learning tools such as Torch and TensorFlow has opened new possibilities but also created a new realm of complications related to the completely different technology employed. We present here a series of examples entirely written in Python, a language that combines the simplicity of Matlab with the power and speed of compiled languages such as C, and apply them to a wide range of geological processes such as porous media flow, multiphase fluid dynamics, creeping flow, and many-faults interaction. We also explore ways in which machine learning can be employed in combination with numerical modelling, from immediately interpreting a large number of modeling results to optimizing a set of modeling parameters to obtain a desired simulation. We show that by using Python, undergraduate and graduate students can learn advanced numerical technologies with a minimum of dedicated effort, which in turn encourages them to develop more numerical tools and quickly progress in their computational abilities. We also show how Python allows combining modeling with machine learning like LEGO pieces, thereby simplifying the transition towards a new kind of scientific geo-modelling. The conclusion is that Python is an ideal tool for creating an infrastructure for the geosciences that allows users to quickly develop tools, reuse techniques, and encourage collaborative efforts to interpret and integrate geo-data in profound new ways.
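
    A minimal NumPy example in the spirit of the porous-media-flow exercises described above: 1D pressure diffusion solved with explicit finite differences. The grid, diffusivity, and boundary values are invented for the demonstration.

    ```python
    # Minimal porous-media-flow-style demo: 1D pressure diffusion
    # p_t = D * p_xx, solved with explicit finite differences.
    import numpy as np

    nx, L = 101, 1.0
    dx = L / (nx - 1)
    D = 1e-3                      # hydraulic diffusivity (invented)
    dt = 0.4 * dx**2 / D          # explicit stability limit dt <= 0.5 dx^2 / D

    p = np.zeros(nx)
    p[0] = 1.0                    # fixed-pressure (Dirichlet) boundary at x=0

    for _ in range(2000):
        p[1:-1] += D * dt / dx**2 * (p[2:] - 2.0 * p[1:-1] + p[:-2])
        p[-1] = p[-2]             # no-flow (Neumann) boundary at x=L

    print(p[::20])                # pressure profile diffusing into the domain
    ```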

  15. The Miller Motivation Scale: A New Counselling and Research Tool.

    ERIC Educational Resources Information Center

    Miller, Harold J.

    The Miller Motivation Scale is a 160-item computer-scored scale. It was developed to quickly and easily measure and display the motivational profile of the client. It has eight subscales. Five subscales measure encouragement, self-fulfillment and social interest. They are called Creative, Innovative, Productive, Cooperative, and Power. Three…

  16. ARC-2007-ACD07-0140-001

    NASA Image and Video Library

    2007-07-31

    David L. Iverson of NASA Ames Research center, Moffett Field, California, led development of computer software to monitor the conditions of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. The gyroscopes are flywheels that control the station's attitude without the use of propellant fuel. NASA computer scientists designed the new software, the Inductive Monitoring System, to detect warning signs that precede a gyroscope's failure. According to NASA officials, engineers will add the new software tool to a group of existing tools to identify and track problems related to the gyroscopes. If the software detects warning signs, it will quickly warn the space station's mission control center.

  17. Integration of a neuroimaging processing pipeline into a pan-canadian computing grid

    NASA Astrophysics Data System (ADS)

    Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.

    2012-02-01

    The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.

  18. The Perfect Neuroimaging-Genetics-Computation Storm: Collision of Petabytes of Data, Millions of Hardware Devices and Thousands of Software Tools

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.

    2013-01-01

    The volume, diversity and velocity of biomedical data are exponentially increasing, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens of thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276

  19. QuickProbs 2: Towards rapid construction of high-quality alignments of large protein families

    PubMed Central

    Gudyś, Adam; Deorowicz, Sebastian

    2017-01-01

    The ever-increasing size of sequence databases caused by the development of high-throughput sequencing poses one of the greatest challenges yet to multiple alignment algorithms. As we show, well-established techniques employed for increasing alignment quality, i.e., refinement and consistency, are ineffective when large protein families are investigated. We present QuickProbs 2, an algorithm for multiple sequence alignment. Based on probabilistic models, equipped with novel column-oriented refinement and selective consistency, it offers outstanding accuracy. When analysing hundreds of sequences, QuickProbs 2 is noticeably better than ClustalΩ and MAFFT, the previous leaders for processing numerous protein families. In the case of smaller sets, for which consistency-based methods are the best performing, QuickProbs 2 is also superior to the competitors. Due to the low computational requirements of selective consistency and the utilization of massively parallel architectures, the presented algorithm has execution times similar to ClustalΩ, and is orders of magnitude faster than full consistency approaches, like MSAProbs or PicXAA. All these make QuickProbs 2 an excellent tool for aligning families ranging from a few to hundreds of proteins. PMID:28139687

  20. Comparison of gross anatomy test scores using traditional specimens vs. QuickTime Virtual Reality animated specimens

    NASA Astrophysics Data System (ADS)

    Maza, Paul Sadiri

    In recent years, technological advances such as computers have been employed in teaching gross anatomy at all levels of education, even in professional schools such as medical and veterinary medical colleges. Benefits of computer-based instructional tools for gross anatomy include the convenience of not having to physically view or dissect a cadaver. Anatomy educators debate the advantages and disadvantages of computer-based resources for gross anatomy instruction. Many studies, case reports, and editorials argue for the increased use of computer-based anatomy educational tools, while others discuss the necessity of dissection for various reasons important in learning anatomy, such as a three-dimensional physical view of the specimen, physical handling of tissues, interactions with fellow students during dissection, and differences between specific specimens. While many articles deal with gross anatomy education using computers, there seems to be a lack of studies investigating the use of computer-based resources as an assessment tool for gross anatomy, specifically using the Apple application QuickTime Virtual Reality (QTVR). This study used QTVR movie modules to assess whether computer-based QTVR movie module examinations were equal in quality to examinations on actual physical specimens. A gross anatomy course in the College of Veterinary Medicine at Cornell University was used as a source of anatomy students and gross anatomy examinations. Two groups were compared: one group took gross anatomy examinations in a traditional manner, by viewing actual physical specimens and answering questions based on those specimens; the other group took the same examinations using the same specimens, but the specimens were viewed as simulated three-dimensional objects in a QTVR movie module. Sample group means for the assessments were compared. A survey was also administered asking students' perceptions of the quality and user-friendliness of the QTVR movie modules. The comparison of the two sample group means of the examinations shows that there was no difference in results between using QTVR movie modules to test gross anatomy knowledge and using physical specimens. The results of this study are discussed to explain the benefits of using such computer-based anatomy resources in gross anatomy assessments.

  1. Computer Assisted Reading in German as a Foreign Language: Developing and Testing an NLP-Based Application

    ERIC Educational Resources Information Center

    Wood, Peter

    2011-01-01

    "QuickAssist," the program presented in this paper, uses natural language processing (NLP) technologies. It places a range of NLP tools at the disposal of learners, intended to enable them to independently read and comprehend a German text of their choice while they extend their vocabulary, learn about different uses of particular words,…

  2. EMHP: an accurate automated hole masking algorithm for single-particle cryo-EM image processing.

    PubMed

    Berndsen, Zachary; Bowman, Charles; Jang, Haerin; Ward, Andrew B

    2017-12-01

    The Electron Microscopy Hole Punch (EMHP) is a streamlined suite of tools for quick assessment, sorting and hole masking of electron micrographs. With recent advances in single-particle electron cryo-microscopy (cryo-EM) data processing allowing for the rapid determination of protein structures using a smaller computational footprint, we saw the need for a fast and simple tool for data pre-processing that could run independent of existing high-performance computing (HPC) infrastructures. EMHP provides a data preprocessing platform in a small package that requires minimal Python dependencies to function. Availability: https://www.bitbucket.org/chazbot/emhp (Apache 2.0 License). Contact: bowman@scripps.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
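
    The abstract does not spell out the masking algorithm, but the idea of hole masking can be conveyed with a toy NumPy sketch: flag the bright foil-hole region of a synthetic micrograph by intensity threshold and dilate the mask outward. This is a conceptual illustration, not EMHP's code:

    ```python
    # Toy illustration of "hole masking" on a synthetic micrograph: threshold
    # the bright foil-hole region and dilate the mask. Not EMHP's algorithm.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    img = rng.normal(0.0, 1.0, size=(256, 256))
    yy, xx = np.mgrid[:256, :256]
    img[(yy - 60) ** 2 + (xx - 200) ** 2 < 40 ** 2] += 4.0  # fake bright hole

    mask = img > 2.5                                   # threshold bright pixels
    mask = ndimage.binary_dilation(mask, iterations=3) # pad the mask outward
    keep = ~mask                                       # pixels safe for picking
    print(mask.sum(), "masked pixels")
    ```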

  3. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tool CAPTools. Steps in parallelizing this code and requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve the processing efficiency.

  4. A Simple Evacuation Modeling and Simulation Tool for First Responders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B; Payne, Patricia W

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.

  5. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  6. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  7. Automatic Generation of Directive-Based Parallel Programs for Shared Memory Parallel Systems

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Yan, Jerry; Frumkin, Michael

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs with compiler directives has improved substantially. The introduction of OpenMP directives, the industrial standard for shared-memory programming, has minimized the issue of portability. Due to its ease of programming and its good performance, the technique has become very popular. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate directive-based, OpenMP parallel programs. We outline techniques used in the implementation of the tool and present test results on the NAS Parallel Benchmarks and ARC3D, a CFD application. This work demonstrates the great potential of using computer-aided tools to quickly port parallel programs and also achieve good performance.

  8. Computational algorithm to evaluate product disassembly cost index

    NASA Astrophysics Data System (ADS)

    Zeid, Ibrahim; Gupta, Surendra M.

    2002-02-01

    Environmentally conscious manufacturing is an important paradigm in today's engineering practice. Disassembly is a crucial factor in implementing this paradigm. Disassembly allows the reuse and recycling of parts and products that reach the end of their life cycle. There are many questions that must be answered before a disassembly decision can be reached. The most important question is economic: the cost of disassembly versus the cost of scrapping a product is always considered. This paper develops a computational tool that allows decision-makers to calculate the disassembly cost of a product. The tool makes it simple to perform 'what if' scenarios fairly quickly. The tool is Web-based and has two main parts. The front-end part is a Web page and runs on the client side in a Web browser, while the back-end part is a disassembly engine (servlet) that holds disassembly knowledge and costing algorithms and runs on the server side. The tool is based on the client/server model that is pervasively utilized throughout the World Wide Web. An example is used to demonstrate the implementation and capabilities of the tool.
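
    The paper does not publish its costing algorithm, so the sketch below is only a hedged illustration of the kind of computation such a back end might perform: per-operation disassembly time priced at a labor rate, compared against the value recovered. All operation names and numbers are hypothetical:

    ```python
    # Hedged sketch of a disassembly cost calculation. The cost model (time *
    # labor rate, compared with recovered value) and all numbers are invented.
    LABOR_RATE = 0.50   # $ per second of disassembly work (hypothetical)

    operations = [      # (name, time_sec, recovered_value_$)
        ("remove cover screws", 30, 0.0),
        ("unclip housing",      12, 1.5),
        ("extract PCB",         25, 8.0),
        ("pull copper coil",    40, 6.0),
    ]

    disassembly_cost = sum(t * LABOR_RATE for _, t, _v in operations)
    recovered_value = sum(v for _, _t, v in operations)

    # A simple "what if": does disassembly pay for itself?
    index = recovered_value / disassembly_cost   # > 1 means disassembly pays
    print(f"cost=${disassembly_cost:.2f} value=${recovered_value:.2f} "
          f"index={index:.2f}")
    ```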

  9. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    NASA Astrophysics Data System (ADS)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors in, and troubleshoot such a large system. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the experiments and it has already proven to be very efficient in optimizing the running systems and detecting misbehaving processes or nodes.

  10. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
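
    A minimal sketch of the parallel post-processing idea, assuming one result set per contingency and a hypothetical voltage-violation check; the real function runs on HPC hardware against actual solver output:

    ```python
    # Minimal sketch of parallel post-processing of contingency outputs: scan
    # each contingency's result for limit violations, fanned out over worker
    # processes. The file layout and 0.1 p.u. threshold are hypothetical.
    from multiprocessing import Pool
    import numpy as np

    def check_contingency(case_id):
        # Stand-in for loading one contingency's bus-voltage solution.
        rng = np.random.default_rng(case_id)
        voltages = rng.normal(1.0, 0.05, size=60)      # 60-bus system, p.u.
        violations = int(np.sum(np.abs(voltages - 1.0) > 0.1))
        return case_id, violations

    if __name__ == "__main__":
        with Pool(processes=8) as pool:
            results = pool.map(check_contingency, range(1000))  # 1000 cases
        worst = sorted(results, key=lambda r: -r[1])[:5]
        print("worst contingencies:", worst)
    ```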

  11. A computer method for schedule processing and quick-time updating.

    NASA Technical Reports Server (NTRS)

    Mccoy, W. H.

    1972-01-01

    A schedule analysis program is presented which can be used to process any schedule with continuous flow and with no loops. Although generally thought of as a management tool, it has applicability to such extremes as music composition and computer program efficiency analysis. Other possibilities for its use include the determination of electrical power usage during some operation such as spacecraft checkout, and the determination of impact envelopes for the purpose of scheduling payloads in launch processing. At the core of the described computer method is an algorithm which computes the position of each activity bar on the output waterfall chart. The algorithm is basically a maximal-path computation which gives to each node in the schedule network the maximal path from the initial node to the given node.
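
    The maximal-path computation the abstract describes can be sketched in a few lines for an acyclic schedule network: process nodes in topological order and set each activity's bar position to the longest path from the initial node. The toy durations below are invented:

    ```python
    # Sketch of the core maximal-path computation: in an acyclic schedule
    # network, each activity bar starts at the longest path from the start.
    from graphlib import TopologicalSorter

    duration = {"A": 3, "B": 2, "C": 4, "D": 1}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}  # D after B, C

    start = {}
    for node in TopologicalSorter(preds).static_order():
        # Earliest start = max over predecessors of (their start + duration).
        start[node] = max((start[p] + duration[p] for p in preds[node]),
                          default=0)

    print(start)   # {'A': 0, 'B': 3, 'C': 3, 'D': 7} -- bar positions
    ```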

  12. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard root analysis

    NASA Astrophysics Data System (ADS)

    Grzeszczuk, A.; Kowalski, S.

    2015-04-01

    Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase the speed of graphics by performing calculations in parallel. The success of this solution has opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volume of data in pulse shape analysis measurements, either by on-line recalculation or by a very quick compression scheme. The simplified structure of the CUDA system and a programming model based on the example of an Nvidia GeForce GTX 580 card are presented in our poster contribution, both in a stand-alone version and as a ROOT application.

  13. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  14. Use of cloud computing technology in natural hazard assessment and emergency management

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2015-12-01

    During a natural hazard event, the most up-to-date data need to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data as well as ground and airborne data generated in real time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud compute system and then those researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing cloud computing systems and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, therefore increasing their understanding of the ongoing hazard and mitigating its impact on the surrounding region.

  15. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT hosted data.

  16. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  17. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.

  18. [Computer graphic display of retinal examination results. Software improving the quality of documenting fundus changes].

    PubMed

    Jürgens, Clemens; Grossjohann, Rico; Czepita, Damian; Tost, Frank

    2009-01-01

    Graphic documentation of retinal examination results in clinical ophthalmological practice is often depicted using pictures or in handwritten form. Popular software products used to describe changes in the fundus do not vary much from simple graphics programs that enable the user to insert, scale and edit basic graphic elements such as a circle, rectangle, arrow or text. Displaying the results of retinal examinations in a unified way is difficult to achieve. Therefore, we devised and implemented modern software tools for this purpose. A computer program was created that enables quick and intuitive creation of fundus graphs that can be digitally archived or printed. Especially for the needs of ophthalmological clinics, a set of standard digital symbols used to document the results of retinal examinations was developed and installed in a library of graphic symbols. These symbols are divided into the following categories: preoperative, postoperative, neovascularization, retinopathy of prematurity. The appropriate symbol can be selected with a click of the mouse and dragged-and-dropped onto the canvas of the fundus. Current forms of documenting results of retinal examinations are unsatisfactory, because they are time consuming and imprecise; unequivocal interpretation is difficult or in some cases impossible. Using the developed computer program, a sketch of the fundus can be created much more quickly than by hand drawing. Additionally, the quality of the medical documentation will be enhanced by a system of well described and standardized symbols. (1) Graphic symbols used to document the results of retinal examinations are a part of everyday clinical practice. (2) The designed computer program will allow quick and intuitive graphical creation of fundus sketches that can be either digitally archived or printed.

  19. Stereolithography: a potential new tool in forensic medicine.

    PubMed

    Dolz, M S; Cina, S J; Smith, R

    2000-06-01

    Stereolithography is a computer-mediated method that can be used to quickly create anatomically correct three-dimensional epoxy and acrylic resin models from various types of medical data. Multiple imaging modalities can be exploited, including computed tomography and magnetic resonance imaging. The technology was first developed and used in 1986 to overcome limitations in previous computer-aided manufacturing/milling techniques. Stereolithography is presently used to accurately reproduce both the external and internal anatomy of body structures. Current medical uses of stereolithography include preoperative planning of orthopedic and maxillofacial surgeries, the fabrication of custom prosthetic devices, and the assessment of the degree of bony and soft-tissue injury caused by trauma. We propose that there is a useful, as yet untapped, potential for this technology in forensic medicine.

  20. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
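
    The core idea of an application simulator, replacing each time-intensive stage by the passage of simulated time in a discrete event loop, can be sketched with a plain event queue. The stage names follow the TAD description loosely; the durations and detection rule are invented placeholders, not measured TAD timings:

    ```python
    # Minimal discrete-event skeleton of the application-simulator idea: each
    # costly stage is replaced by the passage of simulated time.
    import heapq

    events = []                                    # (time, stage) queue
    heapq.heappush(events, (0.0, "md_block"))

    STAGE_COST = {"md_block": 5.0, "neb": 3.0}     # fake stage durations (s)
    clock, transitions, md_blocks = 0.0, 0, 0

    while transitions < 3:
        clock, stage = heapq.heappop(events)
        if stage == "md_block":
            md_blocks += 1
            # Stand-in rule: every second MD block detects a transition.
            nxt = "neb" if md_blocks % 2 == 0 else "md_block"
            heapq.heappush(events, (clock + STAGE_COST[stage], nxt))
        else:  # "neb": saddle-point refinement of the found transition
            transitions += 1
            heapq.heappush(events, (clock + STAGE_COST[stage], "md_block"))

    print(f"simulated time: {clock:.1f} s for {transitions} transitions")
    ```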

  1. Synthesis of research on work zone delays and simplified application of QuickZone analysis tool.

    DOT National Transportation Integrated Search

    2010-03-01

    The objectives of this project were to synthesize the latest information on work zone safety and management and identify case studies in which FHWA's decision support tool QuickZone or other appropriate analysis tools could be applied. The results ...

  2. The use of inexpensive computer-based scanning survey technology to perform medical practice satisfaction surveys.

    PubMed

    Shumaker, L; Fetterolf, D E; Suhrie, J

    1998-01-01

    The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs, which allow computer entry of such scanned questionnaire results directly into PC based relational databases, have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys and a variety of other evidence gathering tools have been deployed.

  3. Crystallographic CourseWare

    NASA Astrophysics Data System (ADS)

    Kastner, Margaret E.; Vasbinder, Eric; Kowalcyzk, Deborah; Jackson, Sean; Giammalvo, Joseph; Braun, James; Dimarco, Keith

    2000-09-01


  4. Conic state extrapolation. [computer program for space shuttle navigation and guidance requirements]

    NASA Technical Reports Server (NTRS)

    Shepperd, S. W.; Robertson, W. M.

    1973-01-01

    The Conic State Extrapolation Routine provides the capability to conically extrapolate any spacecraft inertial state vector either backwards or forwards as a function of time or as a function of transfer angle. It is merely the coded form of two versions of the solution of the two-body differential equations of motion of the spacecraft center of mass. Because of its relatively fast computation speed and moderate accuracy, it serves as a preliminary navigation tool and as a method of obtaining quick solutions for targeting and guidance functions. More accurate (but slower) results are provided by the Precision State Extrapolation Routine.
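
    A hedged sketch of the elliptic case, using the classical Lagrange f-and-g coefficients with a Newton solve of the generalized Kepler equation; the flight routine's own formulation is more general, and the gravitational parameter and test state below are invented:

    ```python
    # Hedged sketch of conic (two-body) state extrapolation for an elliptic
    # orbit via the classical Lagrange f-and-g coefficients.
    import numpy as np

    def conic_extrapolate(r0, v0, dt, mu):
        r0n = np.linalg.norm(r0)
        a = 1.0 / (2.0 / r0n - np.dot(v0, v0) / mu)   # semi-major axis, a > 0
        n = np.sqrt(mu / a**3)                        # mean motion
        sig0 = np.dot(r0, v0) / np.sqrt(mu)

        dE = n * dt                                   # Newton solve of the
        for _ in range(50):                           # generalized Kepler eqn.
            f_val = (dE - (1 - r0n / a) * np.sin(dE)
                     + sig0 / np.sqrt(a) * (1 - np.cos(dE)) - n * dt)
            r_over_a = (1 - (1 - r0n / a) * np.cos(dE)
                        + sig0 / np.sqrt(a) * np.sin(dE))   # dF/d(dE) = r/a
            dE -= f_val / r_over_a
        r1n = a * r_over_a                            # radius at t0 + dt

        f = 1 - a / r0n * (1 - np.cos(dE))            # Lagrange coefficients
        g = dt - np.sqrt(a**3 / mu) * (dE - np.sin(dE))
        fdot = -np.sqrt(mu * a) * np.sin(dE) / (r1n * r0n)
        gdot = 1 - a / r1n * (1 - np.cos(dE))
        return f * r0 + g * v0, fdot * r0 + gdot * v0

    mu = 398600.4418                        # km^3/s^2 (Earth)
    r0 = np.array([7000.0, 0.0, 0.0])       # km
    v0 = np.array([0.0, 7.6, 0.0])          # km/s
    r1, v1 = conic_extrapolate(r0, v0, 600.0, mu)   # 10 minutes ahead
    print(r1, v1)
    ```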

  5. Computer-assisted virtual autopsy using surgical navigation techniques.

    PubMed

    Ebert, Lars Christian; Ruder, Thomas D; Martinez, Rosa Maria; Flach, Patricia M; Schweitzer, Wolf; Thali, Michael J; Ampanozi, Garyfalia

    2015-01-01

    OBJECTIVE: Virtual autopsy methods, such as postmortem CT and MRI, are increasingly being used in forensic medicine. Forensic investigators with little to no training in diagnostic radiology and medical laypeople such as state's attorneys often find it difficult to understand the anatomic orientation of axial postmortem CT images. We present a computer-assisted system that permits postmortem CT datasets to be quickly and intuitively resliced in real time at the body to narrow the gap between radiologic imaging and autopsy. Our system is a potentially valuable tool for planning autopsies, showing findings to medical laypeople, and teaching CT anatomy, thus further closing the gap between radiology and forensic pathology.
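
    The core reslicing operation behind such a system can be sketched by sampling a CT volume along an arbitrarily oriented plane with trilinear interpolation; the volume, plane origin, and in-plane axes below are invented, whereas the real system derives them from a tracked instrument at the body:

    ```python
    # Minimal sketch of oblique reslicing: sample a CT volume on an arbitrary
    # plane with trilinear interpolation. Volume and plane are stand-ins.
    import numpy as np
    from scipy.ndimage import map_coordinates

    vol = np.random.rand(128, 128, 128).astype(np.float32)  # fake CT volume

    origin = np.array([64.0, 64.0, 20.0])        # a point on the plane (voxels)
    u = np.array([1.0, 0.0, 0.0])                # in-plane axis 1 (unit)
    v = np.array([0.0, 0.70710678, 0.70710678])  # in-plane axis 2, oblique

    ii, jj = np.mgrid[-50:51, -50:51]            # 101 x 101 slice grid
    pts = origin[:, None] + u[:, None] * ii.ravel() + v[:, None] * jj.ravel()

    slice_img = map_coordinates(vol, pts, order=1).reshape(101, 101)
    print(slice_img.shape)                       # the resliced oblique image
    ```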

  6. Mining for Data

    NASA Technical Reports Server (NTRS)

    1998-01-01

    AbTech Corporation used an F-18 HARV (High Alpha Research Vehicle) simulation developed by NASA to create an interactive computer-based prototype of the MQ (ModelQuest) SV (System Validator) tool. Dryden Flight Research Center provided support to develop, test, and rapidly reprogram the validation function. AbTech's ModelQuest Enterprise is highly automated and outperforms other modeling techniques in quickly discovering meaningful relationships, patterns, and trends in databases. Applications include technical and business professionals in finance, marketing, business, banking, retail, healthcare, and aerospace.

  7. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.
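
    As a small, hedged illustration of one metric SUPIN reports, the function below evaluates the textbook total pressure recovery across a normal shock versus flight Mach number; SUPIN's low-fidelity models cover far more than this single relation:

    ```python
    # Textbook normal-shock total pressure recovery versus Mach number, one
    # ingredient of inlet performance. Not SUPIN's full analysis method.
    def normal_shock_recovery(M, gamma=1.4):
        """Total pressure ratio pt2/pt1 across a normal shock at Mach M > 1."""
        a = ((gamma + 1) * M**2 / ((gamma - 1) * M**2 + 2)) \
            ** (gamma / (gamma - 1))
        b = ((gamma + 1) / (2 * gamma * M**2 - (gamma - 1))) \
            ** (1 / (gamma - 1))
        return a * b

    for M in (1.6, 1.8, 2.0):   # SUPIN's stated cruise-speed range
        print(f"M={M}: recovery={normal_shock_recovery(M):.4f}")
    ```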

  8. Adoption of online health management tools among healthy older adults: An exploratory study.

    PubMed

    Zettel-Watson, Laura; Tsukerman, Dmitry

    2016-06-01

    As the population ages and chronic diseases abound, overburdened healthcare systems will increasingly require individuals to manage their own health. Online health management tools, quickly increasing in popularity, have the potential to diminish or even replace in-person contact with health professionals, but overall efficacy and usage trends are unknown. The current study explored perceptions and usage patterns among users of online health management tools, and identified barriers and barrier-breakers among non-users. An online survey was completed by 169 computer users (aged 50+). Analyses revealed that a sizable minority (37%) of participants use online health management tools and most users (89%) are satisfied with these tools, but a limited range of tools are being used and usage occurs in relatively limited domains. Improved awareness and education for online health management tools could enhance people's abilities to remain at home as they age, reducing the financial burden on formal assistance programs. © The Author(s) 2014.

  9. Climate Engine - Monitoring Drought with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Hegewisch, K.; Daudert, B.; Morton, C.; McEvoy, D.; Huntington, J. L.; Abatzoglou, J. T.

    2016-12-01

    Drought has adverse effects on society through reduced water availability and agricultural production and increased wildfire risk. An abundance of remotely sensed imagery and climate data are being collected in near-real time that can provide place-based monitoring and early warning of drought and related hazards. However, even as the wealth of Earth observations grows, tools that can quickly access, process, and visualize these archives and provide answers at decision-relevant scales are lacking. We have developed ClimateEngine.org, a web application that uses Google's Earth Engine platform to enable users to quickly compute and visualize real-time observations. A suite of drought indices allows us to monitor and track drought from local (30-meter) to regional scales and to contextualize current droughts within the historical record. Climate Engine is currently being used by U.S. federal agencies and researchers to develop baseline conditions and impact assessments related to agricultural, ecological, and hydrological drought. Climate Engine is also working with the Famine Early Warning Systems Network (FEWS NET) to expedite the monitoring of agricultural drought over broad areas at risk of food insecurity globally.
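
    For readers unfamiliar with the underlying platform, the hedged sketch below shows the kind of on-the-fly computation such a web application delegates to Earth Engine, here a crude seasonal precipitation anomaly. The dataset ID, band, dates, and point are illustrative assumptions, not Climate Engine's actual indices or code.

```python
# Hedged sketch of an on-the-fly drought-style computation in the Earth
# Engine Python API; dataset, band, season, and location are assumptions.
import ee

ee.Initialize()  # assumes prior ee.Authenticate()

chirps = ee.ImageCollection("UCSB-CHG/CHIRPS/DAILY").select("precipitation")

# June-August precipitation total for 2016 versus the long-term mean of the
# same season: a crude precipitation-anomaly indicator of drought.
season = chirps.filterDate("2016-06-01", "2016-09-01").sum()
climatology = ee.ImageCollection(
    [chirps.filterDate(f"{y}-06-01", f"{y}-09-01").sum() for y in range(1981, 2016)]
).mean()
anomaly = season.subtract(climatology)

# Reduce to a single value at an assumed point of interest (lon, lat).
point = ee.Geometry.Point([-116.0, 39.5])
print(anomaly.reduceRegion(ee.Reducer.mean(), point, scale=5000).getInfo())
```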

  10. Transonic propulsion system integration analysis at McDonnell Aircraft Company

    NASA Technical Reports Server (NTRS)

    Cosner, Raymond R.

    1989-01-01

    The technology of Computational Fluid Dynamics (CFD) is becoming an important tool in the development of aircraft propulsion systems. Two of the most valuable features of CFD are: (1) quick acquisition of flow field data; and (2) complete description of flow fields, allowing detailed investigation of interactions. Current analysis methods complement wind tunnel testing in several ways. Herein, the discussion is focused on CFD methods. However, aircraft design studies need data from both CFD and wind tunnel testing. Each approach complements the other.

  11. Cloud Fingerprinting: Using Clock Skews To Determine Co Location Of Virtual Machines

    DTIC Science & Technology

    2016-09-01

    Cloud computing has quickly revolutionized the computing practices of organizations, to include the Department of Defense. However, security concerns remain; this work investigates whether clock skews can be used to fingerprint virtual machines and determine their co-location in the cloud.

  12. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. Once the user sets up the sequence, the data transfer links automatically transport output from one application as relevant input to the next. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  13. Modeling of power transmission and stress grading for corona protection

    NASA Astrophysics Data System (ADS)

    Zohdi, T. I.; Abali, B. E.

    2017-11-01

    Electrical high-voltage (HV) machines are prone to corona discharges, which lead to power losses as well as damage to the insulating layer. Many different techniques are applied for corona protection, and computational methods aid in selecting the best design. In this paper we develop a reduced-order 1D model estimating the electric field and temperature distribution of a conductor wrapped with different layers, as is usual for HV machines. Many assumptions and simplifications are undertaken for this 1D model; therefore, we compare its results quantitatively to a direct numerical simulation in 3D. Both models are transient and nonlinear, offering the choice of a quick estimate in 1D or a full computation in 3D at a higher computational cost. Such tools enable understanding, evaluation, and optimization of corona shielding systems for multilayered coils.
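
    To make the "reduced-order 1D layered model" idea concrete, here is a deliberately stripped-down sketch: steady-state, linear, planar heat conduction through a stack of layers with invented properties. The paper's model is transient, nonlinear, and electro-thermally coupled, so treat this only as the structural skeleton of such a reduction.

```python
# Minimal 1D layered sketch (steady-state, linear, planar) with made-up
# layer properties; the paper's reduced-order model is transient, nonlinear,
# and electro-thermally coupled, so this shows only the layered structure.

layers = [  # (thickness [m], thermal conductivity [W/m/K]), inner to outer
    (0.5e-3, 400.0),  # conductor-side copper
    (0.3e-3, 0.2),    # insulation layer
    (0.2e-3, 10.0),   # stress-grading / corona-protection layer
]
q = 2.0e4        # assumed heat flux leaving the conductor [W/m^2]
t_outer = 60.0   # assumed outer-surface temperature [degC]

# Series-resistance analogy: each layer adds a thermal resistance d/k, so
# the temperature rises by q*d/k across it, walking from outside inward.
t = t_outer
for d, k in reversed(layers):
    t += q * d / k
print(f"estimated conductor-side temperature: {t:.1f} degC")
```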

  14. Simple tools for assembling and searching high-density picolitre pyrophosphate sequence data.

    PubMed

    Parker, Nicolas J; Parker, Andrew G

    2008-04-18

    The advent of pyrophosphate sequencing makes large volumes of sequencing data available at a lower cost than previously possible. However, the short read lengths are difficult to assemble and the large dataset is difficult to handle. During the sequencing of a virus from the tsetse fly, Glossina pallidipes, we found the need for tools to quickly search a set of reads for near-exact text matches. A set of tools is provided to search a large data set of pyrophosphate sequence reads under a "live" CD version of Linux on a standard PC; the tools can be used by anyone without prior knowledge of Linux and without having to install a Linux setup on the computer. The tools permit short lengths of de novo assembly, checking of existing assembled sequences, selection and display of reads from the data set, and gathering counts of sequences in the reads. Demonstrations are given of the use of the tools to help with checking an assembly against the fragment data set; investigating homopolymer lengths, repeat regions and polymorphisms; and resolving inserted bases caused by incomplete chain extension. The additional information contained in a pyrophosphate sequencing data set beyond a basic assembly is difficult to access due to a lack of tools. The set of simple tools presented here would allow anyone with basic computer skills and a standard PC to access this information.

  15. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly inter-spaced short palindromic repeats/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. The Invasive Species Forecasting System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Most, Neal; Gill, Roger; Ma, Peter

    2011-01-01

    The Invasive Species Forecasting System (ISFS) provides computational support for the generic work processes found in many regional-scale ecosystem modeling applications. Decision support tools built using ISFS allow a user to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of management concern, such as a national park, monument, forest, or refuge. This type of decision product helps resource managers plan invasive species protection, monitoring, and control strategies for the lands they manage. Until now, scientists and resource managers have lacked the data-assembly and computing capabilities to produce these maps quickly and cost efficiently. ISFS focuses on regional-scale habitat suitability modeling for invasive terrestrial plants. ISFS's component architecture emphasizes simplicity and adaptability. Its core services can be easily adapted to produce model-based decision support tools tailored to particular parks, monuments, forests, refuges, and related management units. ISFS can be used to build standalone run-time tools that require no connection to the Internet, as well as fully Internet-based decision support applications. ISFS provides the core data structures, operating system interfaces, network interfaces, and inter-component constraints comprising the canonical workflow for habitat suitability modeling. The predictors, analysis methods, and geographic extents involved in any particular model run are elements of the user space and arbitrarily configurable by the user. ISFS provides small, lightweight, readily hardened core components of general utility. These components can be adapted to unanticipated uses, are tailorable, and require at most a loosely coupled, nonproprietary connection to the Web. Users can invoke capabilities from a command line; programmers can integrate ISFS's core components into more complex systems and services. Taken together, these features enable a degree of decentralization and distributed ownership that have helped other types of scientific information services succeed in recent years.

  17. Got Graphs? An Assessment of Data Visualization Tools

    NASA Technical Reports Server (NTRS)

    Schaefer, C. M.; Foy, M.

    2015-01-01

    Graphs are powerful tools for simplifying complex data. They are useful for quickly assessing patterns and relationships among one or more variables from a dataset. As the amount of data increases, it becomes more difficult to visualize potential associations. Lifetime Surveillance of Astronaut Health (LSAH) was charged with assessing its current visualization tools along with others on the market to determine whether new tools would be useful for supporting NASA's occupational surveillance effort. Members of LSAH concluded that the current tools hindered their ability to provide quick results to researchers working with the department. Due to the high volume of data requests and the many iterations of visualizations requested by researchers, software better able to replicate and quickly edit graphs could improve LSAH's efficiency and lead to faster research results.

  18. SUPIN: A Computational Tool for Supersonic Inlet Design

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2016-01-01

    A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.

  19. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    PubMed

    Tripathi, Kumar Parijat; Evangelista, Daniela; Zuccaro, Antonio; Guarracino, Mario Rosario

    2015-01-01

    RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery). It offers a report on the statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features, and protein-protein interaction-related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software package for both NGS and array data. It helps users to characterize de novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. The web application is freely available at: http://www-labgtp.na.icar.cnr.it/Transcriptator.

  20. SedCT: MATLAB™ tools for standardized and quantitative processing of sediment core computed tomography (CT) data collected using a medical CT scanner

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.

    2017-08-01

    Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
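
    The core conversion behind such profiles is fixed by the DICOM standard: stored pixel values map to Hounsfield units through the rescale tags. The sketch below, assuming the pydicom package and a directory of single-slice CT files (the file layout and central-region averaging are invented, not SedCT's code), builds a simple down-core mean-HU profile.

```python
# Sketch of a down-core HU profile from a directory of DICOM CT slices.
# Assumes pydicom; file layout and averaging choices are illustrative only.
import glob
import numpy as np
import pydicom

profile = []
for path in sorted(glob.glob("core_scan/*.dcm")):
    ds = pydicom.dcmread(path)
    # DICOM rescale tags: HU = stored_value * RescaleSlope + RescaleIntercept
    hu = ds.pixel_array.astype(np.float64) * float(ds.RescaleSlope) \
        + float(ds.RescaleIntercept)
    # One down-core sample per slice: mean HU of the central third of the
    # image, crudely avoiding core liner and air near the edges.
    h, w = hu.shape
    profile.append(hu[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3].mean())

np.savetxt("downcore_hu.txt", np.array(profile))
```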

  1. Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool

    NASA Astrophysics Data System (ADS)

    Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong

    2016-06-01

    The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool, which is able to investigate strata in a relatively large range of space around the borehole. The BAAR is designed based on the idea of modularization with a very complex structure, so it has become urgent for us to develop a dedicated test-bench system to debug each module of the BAAR. With the help of the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed based on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is designed based on VC++. The embedded controlling board uses an ARM7 (Advanced RISC Machines 7) processor as the microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed based on the operating system uClinux. The bus interface board, data acquisition board and telemetry communication board are designed based on a field-programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an unqualified channel of the electronic receiving cabin was discovered. It is suggested that the test-bench system can be used to quickly determine the working condition of sub-modules of the BAAR, and it is of great significance in improving production efficiency and accelerating industrial production of the logging tool.

  2. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  3. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team.
These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  4. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    PubMed

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

    RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open-source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive, and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses, and to gain more insights into RNA-seq datasets. In addition, we used a real world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree of automation and interactivity in QuickRNASeq leads to a substantial reduction in the time and effort required prior to further downstream analyses and interpretation of the analyses findings. QuickRNASeq advances primary RNA-seq data analyses to the next level of automation, and is mature for public release and adoption.
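
    As a concrete picture of the Step #2 merge, the hedged sketch below combines per-sample read counts into one gene-by-sample matrix; it assumes each sample produced a two-column tab-separated file of gene IDs and counts, which is an illustrative convention rather than QuickRNASeq's actual file format.

```python
# Sketch of merging per-sample count files into a gene-by-sample matrix.
# Assumes counts/<sample>.txt files with tab-separated (gene_id, count) rows;
# naming and format are assumptions, not QuickRNASeq's actual layout.
import glob
import os
import pandas as pd

columns = {}
for path in sorted(glob.glob("counts/*.txt")):
    sample = os.path.splitext(os.path.basename(path))[0]
    columns[sample] = pd.read_csv(
        path, sep="\t", header=None, names=["gene_id", sample],
        index_col="gene_id",
    )[sample]

# Outer-join all samples on gene_id; genes missing from a sample get 0.
matrix = pd.DataFrame(columns).fillna(0).astype(int)
matrix.to_csv("merged_counts.tsv", sep="\t")
print(matrix.shape)
```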

  5. Prosthetic Hand For Holding Rods, Tools, And Handles

    NASA Technical Reports Server (NTRS)

    Belcher, Jewell G., Jr.; Vest, Thomas W.

    1995-01-01

    Prosthetic hand with quick-grip/quick-release lever broadens range of specialized functions available to lower-arm amputee by providing improved capabilities for gripping rods, tools, handles, and like. Includes two stationary lower fingers opposed by one pivoting upper finger. Lever operates in conjunction with attached bracket.

  6. Computing Mass Properties From AutoCAD

    NASA Technical Reports Server (NTRS)

    Jones, A.

    1990-01-01

    Mass properties of structures are computed from data in drawings. The AutoCAD to Mass Properties (ACTOMP) computer program was developed to facilitate quick calculations of mass properties of structures containing many simple elements in such complex configurations as trusses or sheet-metal containers. Structures are mathematically modeled in AutoCAD or a compatible computer-aided design (CAD) system in minutes by use of three-dimensional elements. Written in Microsoft QuickBASIC (Version 2.0).
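
    The bookkeeping such a tool automates is simple in principle: composite mass is the sum of element masses, and the composite center of mass is their mass-weighted average. A minimal sketch with invented element values:

```python
# Composite mass properties from simple elements (values invented).
import numpy as np

elements = [  # (mass [kg], center of mass [m]) for each simple element
    (2.0, np.array([0.0, 0.0, 0.1])),
    (1.5, np.array([0.3, 0.0, 0.2])),
    (0.5, np.array([0.1, 0.2, 0.0])),
]

total_mass = sum(m for m, _ in elements)
# Center of mass of the assembly: mass-weighted average of element centers.
com = sum(m * c for m, c in elements) / total_mass
print(f"total mass = {total_mass:.2f} kg, center of mass = {com}")
```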

  7. Building complex simulations rapidly using MATRIX(x): The Space Station redesign

    NASA Technical Reports Server (NTRS)

    Carrington, C. K.

    1994-01-01

    MSFC's quick response to the Space Station redesign effort last year required the development of a computer simulation to model the attitude and station-keeping dynamics of a complex body with rotating solar arrays in orbit around the Earth. The simulation was written using a rapid-prototyping graphical simulation and design tool called MATRIX(x) and provided the capability to quickly remodel complex configuration changes by icon manipulation using a mouse. The simulation determines time-dependent inertia properties, and models forces and torques from gravity-gradient, solar radiation, and aerodynamic disturbances. Surface models are easily built from a selection of beams, plates, tetrahedrons, and cylinders. An optimization scheme was written to determine the torque equilibrium attitudes that balance gravity-gradient and aerodynamic torques over an orbit, and propellant-usage estimates were determined. The simulation has been adapted to model the attitude dynamics for small spacecraft.

  8. MSAProbs-MPI: parallel multiple sequence aligner for distributed-memory systems.

    PubMed

    González-Domínguez, Jorge; Liu, Yongchao; Touriño, Juan; Schmidt, Bertil

    2016-12-15

    MSAProbs is a state-of-the-art protein multiple sequence alignment tool based on hidden Markov models. It can achieve high alignment accuracy at the expense of relatively long runtimes for large-scale input datasets. In this work we present MSAProbs-MPI, a distributed-memory parallel version of the multithreaded MSAProbs tool that is able to reduce runtimes by exploiting the compute capabilities of common multicore CPU clusters. Our performance evaluation on a cluster with 32 nodes (each containing two Intel Haswell processors) shows reductions in execution time of over one order of magnitude for typical input datasets. Furthermore, MSAProbs-MPI using eight nodes is faster than the GPU-accelerated QuickProbs running on a Tesla K20. Another strong point is that MSAProbs-MPI can deal with large datasets for which MSAProbs and QuickProbs might fail due to time and memory constraints, respectively. Source code in C++ and MPI running on Linux systems, as well as a reference manual, are available at http://msaprobs.sourceforge.net. Contact: jgonzalezd@udc.es. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
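
    The general pattern, scattering independent work units across ranks and gathering results, can be sketched in a few lines with mpi4py. This is a generic illustration with placeholder work, not MSAProbs-MPI's actual decomposition.

```python
# Generic scatter/gather pattern with mpi4py (placeholder work, not the
# actual MSAProbs-MPI decomposition). Run with: mpiexec -n 4 python demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    # All sequence pairs needing a score; split into one chunk per rank.
    pairs = [(i, j) for i in range(8) for j in range(i + 1, 8)]
    chunks = [pairs[k::size] for k in range(size)]
else:
    chunks = None

my_pairs = comm.scatter(chunks, root=0)
my_scores = [(i, j, float(i * j)) for i, j in my_pairs]  # placeholder "alignment"

all_scores = comm.gather(my_scores, root=0)
if rank == 0:
    flat = [s for chunk in all_scores for s in chunk]
    print(f"computed {len(flat)} pairwise scores on {size} ranks")
```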

  9. Design of a Low Aspect Ratio Transonic Compressor Stage Using CFD Techniques

    NASA Technical Reports Server (NTRS)

    Sanger, Nelson L.

    1994-01-01

    A transonic compressor stage has been designed for the Naval Postgraduate School Turbopropulsion Laboratory. The design relied heavily on CFD techniques while minimizing conventional empirical design methods. The low aspect ratio (1.2) rotor has been designed for a specific head ratio of .25 and a tip relative inlet Mach number of 1.3. Overall stage pressure ratio is 1.56. The rotor was designed using an Euler code augmented by a distributed body force model to account for viscous effects. This provided a relatively quick-running design tool, and was used for both rotor and stator calculations. The initial stator sections were sized using a compressible, cascade panel code. In addition to being used as a case study for teaching purposes, the compressor stage will be used as a research stage. Detailed measurements, including non-intrusive LDV, will be compared with the design computations, and with the results of other CFD codes, as a means of assessing and improving the computational codes as design tools.

  10. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically with inputs through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, a comparison with in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  11. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    NASA Astrophysics Data System (ADS)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows for quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, at the main site of an HEP collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures that address complex real-life scenarios.

  12. Implementation of a web-based, interactive polytrauma tutorial in computed tomography for radiology residents: how we do it.

    PubMed

    Schlorhaufer, C; Behrends, M; Diekhaus, G; Keberle, M; Weidemann, J

    2012-12-01

    Because time is critical in managing polytraumatized patients, all relevant pathologies in a polytrauma computed tomography (CT) scan have to be read and communicated very quickly. During radiology residency, acquisition of effective reading schemes based on typical polytrauma pathologies is very important. Thus, an online tutorial for the structured diagnosis of polytrauma CT was developed. Based on current multimedia theories such as the cognitive load theory, a didactic concept was developed. As the web environment, the learning management system ILIAS was chosen. CT data sets were converted into online scrollable QuickTime movies. Audiovisual tutorial movies with guided image analyses by a consultant radiologist were recorded. The polytrauma tutorial consists of chapterized text content and embedded interactive scrollable CT data sets. Selected trauma pathologies are demonstrated to the user by guiding tutor movies. Basic reading schemes are communicated with the help of detailed commented movies of normal data sets. Common and important pathologies can be explored in a self-directed manner. Ambitious didactic concepts can be supported by a web-based application on the basis of cognitive load theory and currently available software tools. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. Simulation of mixing in the quick quench region of a rich burn-quick quench mix-lean burn combustor

    NASA Technical Reports Server (NTRS)

    Shih, Tom I.-P.; Nguyen, H. Lee; Howe, Gregory W.; Li, Z.

    1991-01-01

    A computer program was developed to study the mixing process in the quick quench region of a rich burn-quick quench mix-lean burn combustor. The computer program developed was based on the density-weighted, ensemble-averaged conservation equations of mass, momentum (full compressible Navier-Stokes), total energy, and species, closed by a k-epsilon turbulence model with wall functions. The combustion process was modeled by a two-step global reaction mechanism, and NO(x) formation was modeled by the Zeldovich mechanism. The formulation employed in the computer program and the essence of the numerical method of solution are described. Some results obtained for nonreacting and reacting flows with different main-flow to dilution-jet momentum flux ratios are also presented.

  14. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications, including land-use change, snowmelt, and burned-area analysis, will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  15. OverPlotter: A Utility for Herschel Data Processing

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Mei, Y.; Schulz, B.

    2008-08-01

    The OverPlotter utility is a GUI tool written in Java to support interactive data processing (DP) and analysis for the Herschel Space Observatory within the framework of the Herschel Common Science System (HCSS)(Wieprecht et al 2004). The tool expands upon the capabilities of the TableViewer (Zhang & Schulz 2005), providing now also the means to create additional overlays of several X/Y scatter plots within the same display area. These layers can be scaled and panned, either individually, or together as one graph. Visual comparison of data with different origins and units becomes much easier. The number of available layers is not limited, except by computer memory and performance. Presentation images can be easily created by adding annotations, labeling layers and setting colors. The tool will be very helpful especially in the early phases of Herschel data analysis, when a quick access to contents of data products is important.

  16. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide to users a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  17. Simulating pad-electrodes with high-definition arrays in transcranial electric stimulation

    NASA Astrophysics Data System (ADS)

    Kempe, René; Huang, Yu; Parra, Lucas C.

    2014-04-01

    Objective. Research studies on transcranial electric stimulation, including direct current, often use a computational model to provide guidance on the placing of sponge-electrode pads. However, the expertise and computational resources needed for finite element modeling (FEM) make modeling impractical in a clinical setting. Our objective is to make the exploration of different electrode configurations accessible to practitioners. We provide an efficient tool to estimate current distributions for arbitrary pad configurations while obviating the need for complex simulation software. Approach. To efficiently estimate current distributions for arbitrary pad configurations we propose to simulate pads with an array of high-definition (HD) electrodes and use an efficient linear superposition to then quickly evaluate different electrode configurations. Main results. Numerical results on ten different pad configurations on a normal individual show that electric field intensity simulated with the sampled array deviates from the solutions with pads by only 5% and the locations of peak magnitude fields have a 94% overlap when using a dense array of 336 electrodes. Significance. Computationally intensive FEM modeling of the HD array needs to be performed only once, perhaps on a set of standard heads that can be made available to multiple users. The present results confirm that by using these models one can now quickly and accurately explore and select pad-electrode montages to match a particular clinical need.
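
    The efficiency comes from linearity: once the field of each HD electrode under unit current is solved once with FEM, any pad montage is just a weighted sum of those precomputed fields. A hedged numpy sketch of that superposition step, with random stand-in data:

```python
# Superposition sketch: pad montage field as a weighted sum of precomputed
# unit-current HD electrode fields. Shapes and weights are illustrative.
import numpy as np

n_electrodes, n_voxels = 336, 10_000  # voxel count downsampled for the sketch
rng = np.random.default_rng(0)
# Stand-in for the one-time FEM results: E-field of each electrode per voxel.
unit_fields = rng.standard_normal((n_electrodes, n_voxels, 3))

# Currents injected through each HD electrode to mimic two sponge pads; a
# real tool would derive these from pad geometry. They must sum to zero.
weights = np.zeros(n_electrodes)
weights[:10] = 0.2     # electrodes sampling the anode pad
weights[10:20] = -0.2  # electrodes sampling the cathode pad

# Linearity of the governing equations justifies the weighted sum.
pad_field = np.tensordot(weights, unit_fields, axes=1)  # (n_voxels, 3)
intensity = np.linalg.norm(pad_field, axis=1)
print(f"peak field magnitude (arbitrary units): {intensity.max():.3f}")
```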

  18. High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.

    2017-01-01

    To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high aspect ratio flexible wings. We developed a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high aspect ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high aspect ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.

  19. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets.

    PubMed

    Panta, Sandeep R; Wang, Runtang; Fries, Jill; Kalyanam, Ravi; Speer, Nicole; Banich, Marie; Kiehl, Kent; King, Margaret; Milham, Michael; Wager, Tor D; Turner, Jessica A; Plis, Sergey M; Calhoun, Vince D

    2016-01-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of data sets. Finally, we interactively display the output of this approach via a web page, based on the data driven documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g., brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples from over 10,000 data sets including (1) quality control measures calculated from phantom data over time, (2) quality control data from human functional MRI data across various studies, scanners, and sites, and (3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e., data sets with poor QC, clustering of data by site). Results from (3) demonstrate interesting patterns of gray matter and volume, and evaluate how they map onto variables including scanners, age, and gender. In sum, the proposed approach allows researchers to rapidly identify and extract meaningful information from big data sets. Such tools are becoming increasingly important as datasets grow larger.
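
    The first approach reduces to a short, standard recipe: standardize the per-scan QC metrics, embed them in two dimensions with t-SNE, and color by a variable of interest. The sketch below assumes a CSV with invented QC column names and is not the authors' pipeline.

```python
# Sketch of QC-metric t-SNE visualization; column names and file are invented.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

qc = pd.read_csv("qc_metrics.csv")  # per-scan rows: snr, mean_motion, ...
features = StandardScaler().fit_transform(
    qc[["snr", "mean_motion", "ghosting"]].to_numpy()
)

# Two-dimensional embedding that preserves the local structure of QC space.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

plt.scatter(embedding[:, 0], embedding[:, 1],
            c=pd.factorize(qc["site"])[0], cmap="tab10", s=8)
plt.title("t-SNE of scan QC metrics, colored by site")
plt.savefig("qc_tsne.png", dpi=150)
```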

  20. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
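
    The exact case-deletion idea is easy to state generically: refit the estimator with each observation removed and record how far the estimate moves; the empirical influence function approximates this shift without refitting, which is the source of its speed advantage. A toy illustration follows, with a sample mean standing in for the maximum likelihood QTL estimate (this is not the MAPMAKER/SIBS implementation).

```python
# Toy exact case deletion (ECD): influence of each observation on a simple
# estimate. A sample mean stands in for the ML estimate; illustration only.
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(10.0, 2.0, size=50)
y[7] = 30.0  # planted outlier

theta_full = y.mean()
influence = np.array([
    theta_full - np.delete(y, i).mean() for i in range(len(y))
])

# The planted outlier should dominate the case-deletion influence values.
print("most influential case:", int(np.argmax(np.abs(influence))))
```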

  1. A Software Engineering Paradigm for Quick-turnaround Earth Science Data Projects

    NASA Astrophysics Data System (ADS)

    Moore, K.

    2016-12-01

    As is generally the case with applied sciences professional and educational programs, the participants of such programs can come from a variety of technical backgrounds. In the NASA DEVELOP National Program, the participants bring an interdisciplinary mix of backgrounds, with varying levels of experience in computer programming. DEVELOP makes use of geographically explicit data sets, and it is necessary to use geographic information systems and geospatial image processing environments. As data sets cover longer time spans and include more complex sets of parameters, automation is becoming an increasingly prevalent feature. Though platforms such as ArcGIS, ERDAS Imagine, and ENVI facilitate the batch-processing of geospatial imagery, these environments naturally constrain users to the tools that are available. Users must then turn to "homemade" scripting in more traditional programming languages such as Python, JavaScript, or R to automate workflows. However, in the context of quick-turnaround projects like those in DEVELOP, the programming learning curve may be prohibitively steep. In this work, we consider how to best design a software development paradigm that addresses two major constants: an arbitrarily experienced programmer and quick-turnaround project timelines.

  2. MACMULTIVIEW 5.1

    NASA Technical Reports Server (NTRS)

    Norikane, L.

    1994-01-01

    MacMultiview is an interactive tool for the Macintosh II family which allows one to display and make computations utilizing polarimetric radar data collected by the Jet Propulsion Laboratory's imaging SAR (synthetic aperture radar) polarimeter system. The system includes the single-frequency L-band sensor mounted on the NASA CV990 aircraft and its replacement, the multi-frequency P-, L-, and C-band sensors mounted on the NASA DC-8. MacMultiview provides two basic functions: computation of synthesized polarimetric images and computation of polarization signatures. The radar data can be used to compute a variety of images. The total power image displays the sum of the polarized and unpolarized components of the backscatter for each pixel. The magnitude/phase difference image displays the HH (horizontal transmit and horizontal receive polarization) to VV (vertical transmit and vertical receive polarization) phase difference using color. Magnitude is displayed using intensity. The user may also select any transmit and receive polarization combination from which an image is synthesized. This image displays the backscatter which would have been observed had the sensor been configured using the selected transmit and receive polarizations. MacMultiview can also be used to compute polarization signatures, three dimensional plots of backscatter versus transmit and receive polarizations. The standard co-polarization signatures (transmit and receive polarizations are the same) and cross-polarization signatures (transmit and receive polarizations are orthogonal) can be plotted for any rectangular subset of pixels within a radar data set. In addition, the ratio of co- and cross-polarization signatures computed from different subsets within the same data set can also be computed. Computed images can be saved in a variety of formats: byte format (headerless format which saves the image as a string of byte values), MacMultiview (a byte image preceded by an ASCII header), and PICT2 format (standard format readable by MacMultiview and other image processing programs for the Macintosh). Images can also be printed on PostScript output devices. Polarization signatures can be saved in either a PICT format or as a text file containing PostScript commands and printed on any QuickDraw output device. The associated Stokes matrices can be stored in a text file. MacMultiview is written in C-language for Macintosh II series computers. MacMultiview will only run on Macintosh II series computers with 8-bit video displays (gray shades or color). The program also requires a minimum configuration of System 6.0, Finder 6.1, and 1Mb of RAM. MacMultiview is NOT compatible with System 7.0. It requires 32-Bit QuickDraw. Note: MacMultiview may not be fully compatible with preliminary versions of 32-Bit QuickDraw. Macintosh Programmer's Workshop and Macintosh Programmer's Workshop C (version 3.0) are required for recompiling and relinking. The standard distribution medium for this package is a set of three 800K 3.5 inch diskettes in Macintosh format. This program was developed in 1989 and updated in 1991. MacMultiview is a copyrighted work with all copyright vested in NASA. QuickDraw, Finder, Macintosh, and System 7 are trademarks of Apple Computer, Inc.
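
    The "synthesized image" computation is classical polarization synthesis: per pixel, received power is a bilinear form of the receive and transmit antenna Stokes vectors with the pixel's 4x4 Stokes matrix. A hedged sketch follows (normalization conventions vary, and the matrix below is a made-up placeholder rather than real SAR data):

```python
# Polarization synthesis sketch: power as g_r^T M g_t for a pixel's 4x4
# Stokes matrix M. Placeholder matrix; normalization conventions vary.
import numpy as np

def antenna_stokes(psi_deg: float, chi_deg: float) -> np.ndarray:
    """Unit Stokes vector for orientation psi and ellipticity chi (degrees)."""
    p, c = np.radians(2 * psi_deg), np.radians(2 * chi_deg)
    return np.array([1.0, np.cos(p) * np.cos(c), np.sin(p) * np.cos(c), np.sin(c)])

def synthesized_power(M: np.ndarray, tx: tuple, rx: tuple) -> float:
    """Backscatter power for given (psi, chi) transmit and receive states."""
    return float(antenna_stokes(*rx) @ M @ antenna_stokes(*tx))

M = np.diag([2.0, 0.5, -0.3, 0.1])  # placeholder pixel Stokes matrix

# HH: both antennas horizontal (psi=0, chi=0); VV: both vertical (psi=90).
print("HH:", synthesized_power(M, (0, 0), (0, 0)))
print("VV:", synthesized_power(M, (90, 0), (90, 0)))
```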

  3. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending to carry out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.
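
    The chapter's recipes are Perl; in the same spirit, the sketch below expresses a typical reformat-and-filter step in Python (file name, column layout, and threshold are invented). Written as a script for readability, it collapses naturally onto a single `python -c` command line.

```python
# Keep rows of a tab-separated results file whose third column is >= 30,
# a typical pre-analysis filtering step; file and columns are invented.
import sys

with open("hits.tsv") as fh:
    for line in fh:
        fields = line.rstrip("\n").split("\t")
        if len(fields) > 2 and float(fields[2]) >= 30.0:
            sys.stdout.write(line)
```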

  4. Scratch Your Brain Where It Itches: Math Games, Tricks and Quick Activities, Book C-1.

    ERIC Educational Resources Information Center

    Brumbaugh, Doug

    This resource book contains mathematical games, tricks, and quick activities for the classroom. Categories of activities include computation, manipulative challenges, puzzlers, picky puzzlers, patterns, measurement, money, and riddles. The computation section contains 13 classroom games and activities along with 4 manipulative challenges.…

  5. Exploring FlyBase Data Using QuickSearch.

    PubMed

    Marygold, Steven J; Antonazzo, Giulia; Attrill, Helen; Costa, Marta; Crosby, Madeline A; Dos Santos, Gilberto; Goodman, Joshua L; Gramates, L Sian; Matthews, Beverley B; Rey, Alix J; Thurmond, Jim

    2016-12-08

    FlyBase (flybase.org) is the primary online database of genetic, genomic, and functional information about Drosophila species, with a major focus on the model organism Drosophila melanogaster. The long and rich history of Drosophila research, combined with recent surges in genomic-scale and high-throughput technologies, means that FlyBase now houses a huge quantity of data. Researchers need to be able to rapidly and intuitively query these data, and the QuickSearch tool has been designed to meet these needs. This tool is conveniently located on the FlyBase homepage and is organized into a series of simple tabbed interfaces that cover the major data and annotation classes within the database. This unit describes the functionality of all aspects of the QuickSearch tool. With this knowledge, FlyBase users will be equipped to take full advantage of all QuickSearch features and thereby gain improved access to data relevant to their research. Copyright © 2016 John Wiley & Sons, Inc.

  6. Software For Design And Analysis Of Tanks And Cylindrical Shells

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.; Graham, Jerry B.

    1995-01-01

    Skin-stringer Tank Analysis Spreadsheet System (STASS) computer program developed for use as preliminary design software tool that enables quick-turnaround design and analysis of structural domes and cylindrical barrel sections in propellant tanks or other cylindrical shells. Determines minimum required skin thicknesses for domes and cylindrical shells to withstand material failure due to applied pressures (ullage and/or hydrostatic) and runs buckling analyses on cylindrical shells and skin-stringers. Implemented as workbook program, using Microsoft Excel v4.0 on Macintosh II. Also implemented using Microsoft Excel v4.0 for Microsoft Windows v3.1 IBM PC.
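
    The abstract does not give STASS's sizing equations; minimum-gauge calculations of this kind conventionally start from the thin-wall pressure-vessel relation, sketched below in Python with illustrative numbers that are not values from STASS.

        def min_skin_thickness(p, r, sigma_allow):
            """Thinnest wall of a thin-walled cylinder that keeps the hoop
            stress sigma = p * r / t at or below the allowable stress."""
            return p * r / sigma_allow

        # Illustrative only: 0.35 MPa combined ullage + hydrostatic pressure,
        # 1.75 m barrel radius, 280 MPa allowable stress for the skin alloy.
        t = min_skin_thickness(p=0.35e6, r=1.75, sigma_allow=280e6)
        print("minimum skin thickness: %.2f mm" % (t * 1e3))  # ~2.19 mm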

  7. 3D printed microfluidic mixer for point-of-care diagnosis of anemia.

    PubMed

    Plevniak, Kimberly; Campbell, Matthew; Mei He

    2016-08-01

    3D printing has been an emerging fabrication tool in prototyping and manufacturing. We demonstrated 3D-microfluidic-simulation-guided computer design and 3D-printed prototyping for quick-turnaround development of microfluidic 3D mixers, which allow fast self-mixing of reagents with blood through capillary force. Combined with a smartphone, point-of-care diagnosis of anemia from finger-prick blood has been successfully implemented and showed results consistent with clinical measurements. With its 3D fabrication flexibility and smartphone compatibility, this work presents a novel diagnostic strategy for advancing personalized medicine and mobile healthcare.

  8. NGS-based approach to determine the presence of HPV and their sites of integration in human cancer genome.

    PubMed

    Chandrani, P; Kulkarni, V; Iyer, P; Upadhyay, P; Chaubal, R; Das, P; Mulherkar, R; Singh, R; Dutt, A

    2015-06-09

    Human papilloma virus (HPV) accounts for the most common cause of all virus-associated human cancers. Here, we describe the first graphical user interface (GUI)-based automated tool, 'HPVDetector', for non-computational biologists, exclusively for detection and annotation of the HPV genome based on next-generation sequencing data sets. We developed a custom-made reference genome that comprises the human chromosomes along with the annotated genomes of 143 HPV types as pseudochromosomes. The tool runs in one of two modes as defined by the user: a 'quick mode' to identify the presence of HPV types and an 'integration mode' to determine the genomic locations of the sites of integration. The input data can be a paired-end whole-exome, whole-genome or whole-transcriptome data set. HPVDetector is available in the public domain for download: http://www.actrec.gov.in/pi-webpages/AmitDutt/HPVdetector/HPVDetector.html. On the basis of our evaluation of 116 whole-exome, 23 whole-transcriptome and 2 whole-genome data sets, we were able to identify the presence of HPV in 20 exomes and 4 transcriptomes of cervical and head and neck cancer tumour samples. Using the inbuilt annotation module of HPVDetector, we found predominant integration of the viral gene E7, a known oncogene, at known sites 17q21, 3q27, 7q35 and Xq28, and at novel sites of integration in the human genome. Furthermore, co-infections with high-risk HPVs such as 16 and 31 were found to be mutually exclusive compared with low-risk HPV71. HPVDetector is a simple yet precise and robust tool for detecting HPV from tumour samples using a variety of next-generation sequencing data, including whole genome, whole exome and transcriptome. Two different modes (quick detection and integration mode) along with a GUI widen the usability of HPVDetector for biologists and clinicians with minimal computational knowledge.
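
    The abstract does not show the integration-mode code, but the underlying idea - flagging read pairs that straddle a human chromosome and an HPV pseudochromosome in the custom reference - can be sketched with pysam. This illustrates the general approach only, not HPVDetector's implementation; the "HPV" contig-naming convention is assumed.

        import pysam
        from collections import Counter

        def candidate_integration_sites(bam_path, hpv_prefix="HPV"):
            """Count read pairs with one mate on a human chromosome and the
            other on an HPV pseudochromosome; pileups of such chimeric pairs
            point to candidate integration sites."""
            sites = Counter()
            with pysam.AlignmentFile(bam_path, "rb") as bam:
                for read in bam:
                    if not read.is_paired or read.is_unmapped or read.mate_is_unmapped:
                        continue
                    here, mate = read.reference_name, read.next_reference_name
                    if here.startswith(hpv_prefix) != mate.startswith(hpv_prefix):
                        # Record the human-side coordinate, binned to 10 kb.
                        if here.startswith(hpv_prefix):
                            chrom, pos = mate, read.next_reference_start
                        else:
                            chrom, pos = here, read.reference_start
                        sites[(chrom, pos // 10_000)] += 1
            return sites.most_common(20)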

  9. ORBIT: an integrated environment for user-customized bioinformatics tools.

    PubMed

    Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M

    1999-10-01

    There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.
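
    ORBIT itself is a Java system and its interface builder is not reproduced here; the core idea - a client-side interface description rendered into the command line of a server-side program - can be sketched as follows. The spec format is invented, and the legacy blastall flags serve only as an example of a command-line-driven tool.

        import shlex

        # Hypothetical client-side spec: each widget of the customized form
        # maps to one flag of the server-side program, with the user's own
        # preferred defaults baked in.
        spec = {
            "program": "blastall",
            "options": {"-p": "blastn", "-d": "nr", "-e": "1e-10"},
            "input_flag": "-i",
        }

        def build_command(spec, input_path):
            """Render an interface spec plus the user's input file into the
            command the server would run (sketch only)."""
            parts = [spec["program"]]
            for flag, value in spec["options"].items():
                parts += [flag, value]
            parts += [spec["input_flag"], input_path]
            return shlex.join(parts)

        print(build_command(spec, "query.fa"))
        # blastall -p blastn -d nr -e 1e-10 -i query.fa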

  10. Application of Reduced Order Transonic Aerodynamic Influence Coefficient Matrix for Design Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley W.

    2009-01-01

    Supporting the Aeronautics Research Mission Directorate guidelines, the National Aeronautics and Space Administration [NASA] Dryden Flight Research Center is developing a multidisciplinary design, analysis, and optimization [MDAO] tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Today's modern aircraft design at transonic speeds is a challenging task due to the computation time required for unsteady aeroelastic analysis using a Computational Fluid Dynamics [CFD] code. Design approaches in this speed regime are mainly based on manual trial and error, and because the time-domain unsteady CFD computations must usually be performed repeatedly to optimize the final design, they considerably slow down the whole design process. As a result, there is considerable motivation to be able to perform aeroelastic calculations more quickly and inexpensively. This paper describes the development of an unsteady transonic aeroelastic design methodology for design optimization using a reduced modeling method and unsteady aerodynamic approximation. The method requires the unsteady transonic aerodynamics to be represented in the frequency or Laplace domain. A dynamically linear assumption is used for creating Aerodynamic Influence Coefficient [AIC] matrices in the transonic speed regime. Unsteady CFD computations are needed only for the important columns of an AIC matrix, which correspond to the primary modes for flutter. Order reduction techniques, such as Guyan reduction and the improved reduction system, are used to reduce the size of the problem; transonic flutter can then be found by classic methods such as rational function approximation, p-k, p, and root-locus. Such a methodology could be incorporated into an MDAO tool for design optimization at a reasonable computational cost. The proposed technique is verified using the Aerostructures Test Wing 2, which was designed, built, and tested at NASA Dryden Flight Research Center. The results from the full order model and the approximate reduced order model are analyzed and compared.
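
    Guyan (static) reduction, one of the order-reduction techniques named above, condenses the system matrices onto a set of master degrees of freedom through the transformation T = [I; -Kss^-1 Ksm]. A minimal numpy sketch of that standard textbook step (not the paper's code):

        import numpy as np

        def guyan_reduce(K, M, masters):
            """Condense stiffness K and mass M onto the master DOFs:
            K_r = T^T K T and M_r = T^T M T with T = [I; -Kss^-1 Ksm]."""
            n = K.shape[0]
            slaves = [i for i in range(n) if i not in masters]
            order = list(masters) + slaves
            Kp = K[np.ix_(order, order)]
            Mp = M[np.ix_(order, order)]
            m = len(masters)
            Ksm, Kss = Kp[m:, :m], Kp[m:, m:]
            T = np.vstack([np.eye(m), -np.linalg.solve(Kss, Ksm)])
            return T.T @ Kp @ T, T.T @ Mp @ T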

  11. Reconstructing evolutionary trees in parallel for massive sequences.

    PubMed

    Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam

    2017-12-14

    Building evolutionary trees for massive sets of unaligned DNA sequences is crucial but challenging, and massive multiple sequence alignment is itself time- and space-consuming. Hadoop and Spark, developed recently, bring new opportunities to these classical computational biology problems. In this paper, we solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, which is developed in this paper, can deal with big DNA sequence files quickly. It works well on files larger than 1 GB and outperforms other evolutionary reconstruction tools. Users can run HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g., Amazon cloud). HPTree can help in population evolution research and metagenomics analysis. In this paper, we employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are done in parallel, and a neighbour-joining model is employed for the evolutionary tree building. The software and source code are available at http://lab.malab.cn/soft/HPtree/ .
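
    The neighbour-joining step that HPTree parallelizes rests on the classical Saitou-Nei criterion: at each iteration, join the pair (i, j) minimizing Q(i, j) = (n - 2) d(i, j) - sum_k d(i, k) - sum_k d(j, k). A serial Python sketch of that selection step (HPTree's Spark implementation is not shown in the abstract):

        import numpy as np

        def nj_pick_pair(D):
            """Return the pair (i, j) minimizing the neighbour-joining
            Q criterion for a symmetric distance matrix D."""
            n = D.shape[0]
            row_sums = D.sum(axis=1)
            Q = (n - 2) * D - row_sums[:, None] - row_sums[None, :]
            np.fill_diagonal(Q, np.inf)
            i, j = np.unravel_index(np.argmin(Q), Q.shape)
            return int(i), int(j)

        D = np.array([[0, 5, 9, 9], [5, 0, 10, 10],
                      [9, 10, 0, 8], [9, 10, 8, 0]], dtype=float)
        print(nj_pick_pair(D))  # (0, 1): taxa 0 and 1 are joined first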

  12. Multi -omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    PubMed

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated -omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells like multi-scale spatial organization and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from -omics experiments. In this work, we present opportunities and challenges of multi -omics data integration in current systems biology pipelines. We here discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale datasets integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  13. Development and preliminary validation of an index for indicating the risks of the design of working hours to health and wellbeing.

    PubMed

    Schomann, Carsten; Giebel, Ole; Nachreiner, Friedhelm

    2006-01-01

    BASS 4, a computer program for the design and evaluation of working hours, is an example of an ergonomics-based software tool that can be used by safety practitioners on the shop floor with regard to legal, ergonomic, and economic criteria. Based on experience with this computer program, a less sophisticated Working-Hours Risk Index for assessing the quality of work schedules (including flexible work hours) and indicating risks to health and wellbeing has been developed, to provide a quick and easily applicable tool for legally required risk assessments. The results of a validation study show that this risk index seems to be a promising indicator for predicting risks to health and wellbeing. The purpose of the Risk Index is to simplify the evaluation process on the shop floor and provide more general information about the quality of a work schedule that can be used to trigger preventive interventions. Such a risk index complies with practitioners' expectations and requests for easy, useful, and valid instruments.

  14. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND comprises multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state of the practice in situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  15. PANGEA: pipeline for analysis of next generation amplicons

    PubMed Central

    Giongo, Adriana; Crabb, David B; Davis-Richardson, Austin G; Chauliac, Diane; Mobberley, Jennifer M; Gano, Kelsey A; Mukherjee, Nabanita; Casella, George; Roesch, Luiz FW; Walts, Brandon; Riva, Alberto; King, Gary; Triplett, Eric W

    2010-01-01

    High-throughput DNA sequencing can identify organisms and describe population structures in many environmental and clinical samples. Current technologies generate millions of reads in a single run, requiring extensive computational strategies to organize, analyze and interpret those sequences. A series of bioinformatics tools for high-throughput sequencing analysis, including preprocessing, clustering, database matching and classification, have been compiled into a pipeline called PANGEA. The PANGEA pipeline was written in Perl and can be run on Mac OSX, Windows or Linux. With PANGEA, sequences obtained directly from the sequencer can be processed quickly to provide the files needed for sequence identification by BLAST and for comparison of microbial communities. Two different sets of bacterial 16S rRNA sequences were used to show the efficiency of this workflow. The first set of 16S rRNA sequences is derived from various soils from Hawaii Volcanoes National Park. The second set is derived from stool samples collected from diabetes-resistant and diabetes-prone rats. The workflow described here allows the investigator to quickly assess libraries of sequences on personal computers with customized databases. PANGEA is provided for users as individual scripts for each step in the process or as a single script where all processes, except the χ2 step, are joined into one program called the ‘backbone’. PMID:20182525

  16. PANGEA: pipeline for analysis of next generation amplicons.

    PubMed

    Giongo, Adriana; Crabb, David B; Davis-Richardson, Austin G; Chauliac, Diane; Mobberley, Jennifer M; Gano, Kelsey A; Mukherjee, Nabanita; Casella, George; Roesch, Luiz F W; Walts, Brandon; Riva, Alberto; King, Gary; Triplett, Eric W

    2010-07-01

    High-throughput DNA sequencing can identify organisms and describe population structures in many environmental and clinical samples. Current technologies generate millions of reads in a single run, requiring extensive computational strategies to organize, analyze and interpret those sequences. A series of bioinformatics tools for high-throughput sequencing analysis, including pre-processing, clustering, database matching and classification, have been compiled into a pipeline called PANGEA. The PANGEA pipeline was written in Perl and can be run on Mac OSX, Windows or Linux. With PANGEA, sequences obtained directly from the sequencer can be processed quickly to provide the files needed for sequence identification by BLAST and for comparison of microbial communities. Two different sets of bacterial 16S rRNA sequences were used to show the efficiency of this workflow. The first set of 16S rRNA sequences is derived from various soils from Hawaii Volcanoes National Park. The second set is derived from stool samples collected from diabetes-resistant and diabetes-prone rats. The workflow described here allows the investigator to quickly assess libraries of sequences on personal computers with customized databases. PANGEA is provided for users as individual scripts for each step in the process or as a single script where all processes, except the chi(2) step, are joined into one program called the 'backbone'.
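
    PANGEA's scripts are Perl; the 'backbone' idea of running every stage in sequence can be illustrated generically (the script and file names below are placeholders, not PANGEA's actual files).

        import subprocess

        # Run each stage of a PANGEA-like pipeline in order; check=True
        # aborts the run as soon as any stage fails.
        steps = [
            ["perl", "preprocess.pl", "raw_reads.fna", "trimmed.fna"],
            ["perl", "cluster.pl", "trimmed.fna", "clusters.txt"],
            ["perl", "blast_match.pl", "clusters.txt", "hits.txt"],
            ["perl", "classify.pl", "hits.txt", "taxonomy.txt"],
        ]
        for cmd in steps:
            subprocess.run(cmd, check=True)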

  17. NASA's Earth Science Use of Commercially Availiable Remote Sensing Datasets: Cover Image

    NASA Technical Reports Server (NTRS)

    Underwood, Lauren W.; Goward, Samuel N.; Fearon, Matthew G.; Fletcher, Rose; Garvin, Jim; Hurtt, George

    2008-01-01

    The cover image incorporates high resolution stereo pairs acquired from the DigitalGlobe® QuickBird sensor. It shows a digital elevation model of Meteor Crater, Arizona, at approximately 1.3 meter point-spacing. Image analysts used the Leica Photogrammetry Suite to produce the DEM. The outside portion was computed from two QuickBird panchromatic scenes acquired in October 2006, while an Optech laser scan dataset was used for the crater's interior elevations. The crater's terrain model and image drape were created in a NASA Constellation Program project focused on simulating lunar surface environments for prototyping and testing lunar surface mission analysis and planning tools. This work exemplifies NASA's Scientific Data Purchase legacy and commercial high resolution imagery applications, as scientists use commercial high resolution data to examine lunar-analog Earth landscapes for advanced planning and trade studies for future lunar surface activities. Other applications include landscape dynamics related to volcanism, hydrologic events, climate change, and ice movement.

  18. Scalable, Lightweight, Integrated and Quick-to-Assemble (SLIQ) Hyperdrives for Functional Circuit Dissection.

    PubMed

    Liang, Li; Oline, Stefan N; Kirk, Justin C; Schmitt, Lukas Ian; Komorowski, Robert W; Remondes, Miguel; Halassa, Michael M

    2017-01-01

    Independently adjustable multielectrode arrays are routinely used to interrogate neuronal circuit function, enabling chronic in vivo monitoring of neuronal ensembles in freely behaving animals at single-cell, single-spike resolution. Despite the importance of this approach, its widespread use is limited by highly specialized design and fabrication methods. To address this, we have developed a Scalable, Lightweight, Integrated and Quick-to-assemble multielectrode array platform. This platform additionally integrates optical fibers with independently adjustable electrodes to allow simultaneous single-unit recordings and circuit-specific optogenetic targeting and/or manipulation. In current designs, the fully assembled platforms are scalable from 2 to 32 microdrives yet weigh only 1-3 g, light enough for small animals. Here, we describe the design process starting from intent in computer-aided design, parameter testing through finite element analysis and experimental means, and implementation in various applications across mice and rats. Combined, our methods may expand the utility of multielectrode recordings and their continued integration with other tools enabling functional dissection of intact neural circuits.

  19. A simple tool for preliminary hazard identification and quick assessment in craftwork and small/medium enterprises (SME).

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2012-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded within the IEA, in collaboration with the World Health Organization (WHO), aimed at developing a "toolkit for MSD prevention". Possible users of the toolkit include members of health and safety committees, health and safety representatives, line supervisors, labor inspectors, health workers implementing basic occupational health services, and occupational health and safety specialists. In accordance with the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (an application guide for potential users), computer software (in Excel®) was created for hazard "mapping" in craftwork. The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It thus makes it possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which professional consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  20. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
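
    The CEM pattern described above - dispatch each discipline's analysis code as an external process, then gather performance indices for the optimizer - looks roughly like the following sketch. The executable names and file formats are invented for illustration; the actual CEM is FORTRAN and submits scripts via the Call System command.

        import subprocess

        # Hypothetical discipline codes, each writing one performance index
        # (weight, stress, flutter speed, ...) to a small text file.
        analyses = {
            "weight": ["run_weight", "model.dat"],
            "stress": ["run_stress", "model.dat"],
            "flutter": ["run_flutter", "model.dat"],
        }

        def evaluate(design_point):
            """One evaluation of objectives/constraints: write the design
            variables, run every discipline code, read the indices back."""
            with open("design.in", "w") as f:
                f.write("\n".join("%e" % x for x in design_point))
            indices = {}
            for name, cmd in analyses.items():
                subprocess.run(cmd, check=True)  # analogous to CALL SYSTEM
                with open(name + ".out") as f:
                    indices[name] = float(f.read())
            return indices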

  1. Electronic tools for infectious diseases and microbiology

    PubMed Central

    Burdette, Steven D

    2007-01-01

    Electronic tools for infectious diseases and medical microbiology have the ability to change the way the diagnosis and treatment of infectious diseases are approached. Medical information today has the ability to be dynamic, keeping up with the latest research or clinical issues, instead of being static and years behind, as many textbooks are. The ability to rapidly disseminate information around the world opens up the possibility of communicating with people thousands of miles away to quickly and efficiently learn about emerging infections. Electronic tools have expanded beyond the desktop computer and the Internet, and now include personal digital assistants and other portable devices such as cellular phones. These pocket-sized devices have the ability to provide access to clinical information at the point of care. New electronic tools include e-mail listservs, electronic drug databases and search engines that allow focused clinical questions. The goal of the present article is to provide an overview of how electronic tools can impact infectious diseases and microbiology, while providing links and resources to allow users to maximize their efficiency in accessing this information. Links to the mentioned Web sites and programs are provided along with other useful electronic tools. PMID:18978984

  2. SU-C-BRA-03: An Automated and Quick Contour Error Detection for Auto Segmentation in Online Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J; Ates, O; Li, X

    Purpose: To develop a tool that can quickly and automatically assess the quality of contours generated by auto-segmentation during online adaptive replanning. Methods: Because of the strict time requirements of online replanning and the lack of 'ground truth' contours in daily images, our method starts by assessing image registration accuracy, focusing on the surface of the organ in question. Several metrics tightly related to registration accuracy, including Jacobian maps, contour shell deformation, and voxel-based root mean square (RMS) analysis, were computed. To identify correct contours, additional metrics and an adaptive decision tree are introduced. As a proof of principle, tests were performed with CT sets (planning and daily CTs) acquired using a CT-on-rails during routine CT-guided RT delivery for 20 prostate cancer patients. The contours generated on the daily CTs using an auto-segmentation tool (ADMIRE, Elekta, MIM) based on deformable image registration of the planning CT and daily CT were tested. Results: The deformed contours of the 20 patients, with a total of 60 structures, were manually checked as baselines; overall, 49% of the contours were incorrect. To evaluate the quality of local deformation, the Jacobian determinant (1.047±0.045) on the contours was analyzed. In an analysis of the deformed rectum contour shells, a higher contour-error detection rate (0.41) was obtained, compared with 0.32 for the manual check. All automated detections took less than 5 seconds. Conclusion: The proposed method can effectively detect contour errors at both the micro and macro scale by evaluating multiple deformable registration metrics in a parallel computing process. Future work will focus on improving practicability and optimizing the calculation algorithms and metric selection.
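
    Among the metrics above, the Jacobian determinant of the deformation measures local volume change of the registration (values above 1 indicate expansion, below 1 contraction). A numpy sketch for a displacement field sampled on a voxel grid, as an illustration of the metric rather than the authors' implementation:

        import numpy as np

        def jacobian_determinant(disp):
            """Jacobian determinant of the deformation x -> x + u(x), where
            disp has shape (3, nz, ny, nx) holding the components of u."""
            grads = np.stack([np.stack(np.gradient(disp[c]), axis=0)
                              for c in range(3)])      # (comp, axis, z, y, x)
            J = np.moveaxis(grads, (0, 1), (-2, -1))    # (..., 3, 3) = du_i/dx_j
            return np.linalg.det(J + np.eye(3))         # det(I + grad u)

        u = np.zeros((3, 8, 8, 8))                      # identity deformation
        assert np.allclose(jacobian_determinant(u), 1.0)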

  3. HiRel - Reliability/availability integrated workstation tool

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Dugan, Joanne B.

    1992-01-01

    The HiRel software tool is described and demonstrated by application to the mission avionics subsystem of the Advanced System Integration Demonstrations (ASID) system that utilizes the PAVE PILLAR approach. HiRel marks another accomplishment toward the goal of producing a totally integrated computer-aided design (CAD) workstation design capability. Since a reliability engineer generally represents a reliability model graphically before it can be solved, the use of a graphical input description language increases productivity and decreases the incidence of error. The graphical postprocessor module HARPO makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes. The addition of several powerful HARP modeling engines provides the user with a reliability/availability modeling capability for a wide range of system applications all integrated under a common interactive graphical input-output capability.

  4. orthoFind Facilitates the Discovery of Homologous and Orthologous Proteins.

    PubMed

    Mier, Pablo; Andrade-Navarro, Miguel A; Pérez-Pulido, Antonio J

    2015-01-01

    Finding homologous and orthologous protein sequences is often the first step in evolutionary studies, annotation projects, and experiments of functional complementation. Despite all currently available computational tools, there is a requirement for easy-to-use tools that provide functional information. Here, a new web application called orthoFind is presented, which allows a quick search for homologous and orthologous proteins given one or more query sequences, allowing a recurrent and exhaustive search against reference proteomes, and being able to include user databases. It addresses the protein multidomain problem, searching for homologs with the same domain architecture, and gives a simple functional analysis of the results to help in the annotation process. orthoFind is easy to use and has been proven to provide accurate results with different datasets. Availability: http://www.bioinfocabd.upo.es/orthofind/.

  5. Scratch Your Brain Where It Itches: Math Games, Tricks and Quick Activities, Book A-1.

    ERIC Educational Resources Information Center

    Brumbaugh, Linda

    This resource book contains mathematical games, tricks, and quick activities for the classroom. Categories include place value, number lines, basic facts and computation, computation and calculator practice, puzzles for tricky thinkers, and geometry. Ten classroom games and activities are found in the place value and number line sections, 27…

  6. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers, or sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. To assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The intensity map is based on the flooding criterion used in Switzerland, provided by the OFEG, and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for a realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to a multi-scenario assessment. The model computes the evolution of the water depth and velocities through time in 2.5D. It provides maximum-value maps, intensity maps, and data from numerical gauges. This tool is developed for quick hazard assessment; thus it is efficient and requires little computational power. Its capabilities are demonstrated on case studies.
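
    The wave step uses the shallow water equations stabilized by the Lax-Friedrichs scheme; a minimal 1D Python sketch of that scheme is given below. The tool itself works in 2.5D with wet/dry transitions, which this illustration omits.

        import numpy as np

        g = 9.81  # gravitational acceleration (m/s^2)

        def lax_friedrichs_step(h, hu, dx, dt):
            """One Lax-Friedrichs update of the 1D shallow water equations
            U_t + F(U)_x = 0, with U = (h, hu), F = (hu, hu^2/h + g h^2/2)."""
            F1, F2 = hu, hu**2 / h + 0.5 * g * h**2
            def upd(U, F):
                Un = U.copy()  # end points stay fixed (simple boundaries)
                Un[1:-1] = 0.5 * (U[2:] + U[:-2]) - dt / (2 * dx) * (F[2:] - F[:-2])
                return Un
            return upd(h, F1), upd(hu, F2)

        # Still lake with a small Gaussian hump: it splits into two waves.
        x = np.linspace(0.0, 100.0, 501)
        h = 1.0 + 0.1 * np.exp(-((x - 50.0) ** 2) / 10.0)
        hu = np.zeros_like(x)
        dx, dt = x[1] - x[0], 0.02  # dt well under the CFL limit dx/sqrt(g*h)
        for _ in range(500):
            h, hu = lax_friedrichs_step(h, hu, dx, dt)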

  7. Prosthetic Tool For Holding Small Ferromagnetic Parts

    NASA Technical Reports Server (NTRS)

    Norton, William E.; Carden, James R.; Belcher, Jewell G., Jr.; Vest, Thomas W.

    1995-01-01

    Tool attached to prosthetic hand or arm enables user to hold nails, screws, nuts, rivets, and other small ferromagnetic objects on small magnetic tip. Device adjusted to hold nail or screw at proper angle for hammering or for use of screwdriver, respectively. Includes base connector with threaded outer surface and lower male member inserted in standard spring-action, quick-connect/quick-disconnect wrist adapter on prosthetic hand or arm.

  8. Large-scale feature searches of collections of medical imagery

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.

    1993-09-01

    Large scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can either search text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery which the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.

    The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.
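
    The multiprocessing pattern the abstract calls for - farming independent diffraction images out to worker processes so results return fast enough to steer the experiment - can be sketched with the standard library alone. Here count_bragg_spots is a stand-in, not a real cctbx call.

        from multiprocessing import Pool

        def count_bragg_spots(image_path):
            """Stand-in for per-image spotfinding; a real system would run
            the Bragg-spot analyzer on the image at image_path."""
            return image_path, 42  # placeholder spot count

        if __name__ == "__main__":
            images = ["shot_%05d.cbf" % i for i in range(1000)]
            with Pool(processes=8) as pool:
                # Images are independent, so they parallelize trivially and
                # results stream back as each worker finishes.
                for path, n_spots in pool.imap_unordered(count_bragg_spots, images):
                    print(path, n_spots)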

  10. The fundamentals behind solving for unknown molecular structures using computer-assisted structure elucidation: a free software package at the undergraduate and graduate levels.

    PubMed

    Moser, Arvin; Pautler, Brent G

    2016-05-15

    The successful elucidation of an unknown compound's molecular structure often requires an analyst with profound knowledge and experience of advanced spectroscopic techniques, such as Nuclear Magnetic Resonance (NMR) spectroscopy and mass spectrometry. The implementation of Computer-Assisted Structure Elucidation (CASE) software in solving for unknown structures, such as isolated natural products and/or reaction impurities, can serve as both an elucidation and a teaching tool. As such, introducing CASE software with 112 exercises to train students, in conjunction with the traditional pen-and-paper approach, will strengthen their overall understanding of solving unknowns and let them explore various structural end points to quickly determine the validity of the results. Copyright © 2016 John Wiley & Sons, Ltd.

  11. FUJIFILM X10 white orbs and DeOrbIt

    NASA Astrophysics Data System (ADS)

    Dietz, Henry Gordon

    2013-01-01

    The FUJIFILM X10 is a high-end enthusiast compact digital camera using an unusual sensor design. Unfortunately, upon its Fall 2011 release, the camera quickly became infamous for the uniquely disturbing "white orbs" that often appeared in areas where the sensor was saturated. FUJIFILM's first attempt at a fix was firmware released on February 25, 2012, but it had little effect. In April 2012, a sensor replacement essentially solved the problem. This paper explores the "white orb" phenomenon in detail. After FUJIFILM's attempt at a firmware fix failed, the author decided to create a post-processing tool that could automatically repair existing images. DeOrbIt was released as a free tool on March 7, 2012. To better understand the problem and how to fix it, the WWW form version of the tool logs images, processing parameters, and evaluations by users. The current paper describes the technical problem, the novel computational photography methods used by DeOrbIt to repair affected images, and the public perceptions revealed by this experiment.

  12. A mobile tool about causes and distribution of dramatic natural phenomena

    NASA Astrophysics Data System (ADS)

    Boppidi, Ravikanth Reddy

    Research suggests that tablet computers can aid the study of many scientific concepts that are difficult to grasp, such as place, time, and statistics, particularly in the study of geology, chemistry, and biology. Tapping this technology will soon become critical career training for future generations, and teaching through mobile devices is more interactive and helps students grasp material quickly. In this thesis, an interactive mobile tool is developed which explains the causes and distribution of natural disasters such as earthquakes, tsunamis, tropical cyclones, volcanic eruptions, and tornadoes. The application shows the locations of disasters on an interactive map and also contains embedded YouTube videos, which explain the disasters visually. The advantage of this tool is that it can be deployed onto major mobile operating systems such as Android and iOS. The application's user interface (UI) is made responsive using D3.js, jQuery, JavaScript, HTML, and CSS so that it can adapt to mobile, tablet, and desktop screens.

  13. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has been widely used; in particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computing field, and computer graphics technology is being applied in more and more fields. In recent years, with the development of the social economy, and especially the rapid development of information technology, the traditional way of managing communication resources can no longer effectively meet the needs of resource management: it still relies on the original management tools and methods for equipment management and maintenance, which has brought many problems. It is very difficult for non-professionals to understand the equipment and its status in communication resource management, resource utilization is relatively low, and managers cannot quickly and accurately grasp resource conditions. Aimed at the above problems, this paper proposes to introduce computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  14. Analysis of lead twist in modern high-performance grinding methods

    NASA Astrophysics Data System (ADS)

    Kundrák, J.; Gyáni, K.; Felhő, C.; Markopoulos, AP; Deszpoth, I.

    2016-11-01

    According to the quality requirements for road vehicle shafts that bear dynamic seals, a twisted-pattern micro-geometrical topography is not allowed. It is an open question whether newer high-performance grinding methods - such as quick-point grinding and peel grinding - can provide a twist-free topography. According to industrial experience, twist-free surfaces can be made; however, with certain settings, some twist occurs. In this paper it is proved by detailed chip-geometrical analysis that the topography generated by the new procedures is theoretically twist-patterned because of the feeding motion of the CBN tool. The presented investigation was carried out using a single-grain wheel model and computer simulation.

  15. Digital technologies for cognitive assessment to accelerate drug development in Alzheimer's disease.

    PubMed

    Leurent, C; Ehlers, M D

    2015-11-01

    For many neurological and psychiatric diseases, novel therapeutics have been elusive for decades. By focusing on attention interference in Alzheimer's disease (AD), we provide a future vision on how emerging mobile, computer, and device-based cognitive tools are converting classically noisy, subjective, data-poor clinical endpoints associated with neuropsychiatric disease assessment into a richer, scalable, and objective set of measurements. Incorporation of such endpoints into clinical drug trials holds promise for more quickly and efficiently developing new medicines. © 2015 The Authors. Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.

  16. Microbiome Tools for Forensic Science.

    PubMed

    Metcalf, Jessica L; Xu, Zhenjiang Z; Bouslimani, Amina; Dorrestein, Pieter; Carter, David O; Knight, Rob

    2017-09-01

    Microbes are present at every crime scene and have been used as physical evidence for over a century. Advances in DNA sequencing and computational approaches have led to recent breakthroughs in the use of microbiome approaches for forensic science, particularly in the areas of estimating postmortem intervals (PMIs), locating clandestine graves, and obtaining soil and skin trace evidence. Low-cost, high-throughput technologies allow us to accumulate molecular data quickly and to apply sophisticated machine-learning algorithms, building generalizable predictive models that will be useful in the criminal justice system. In particular, integrating microbiome and metabolomic data has excellent potential to advance microbial forensics. Copyright © 2017. Published by Elsevier Ltd.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springmeyer, R R; Brugger, E; Cook, R

    The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool, to providing classes on how to use the tool, to performing data analysis and visualization for customers. The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization techniques for large scale data exploration that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls. The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had 4 releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk usage view which works on all types of connections and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to the development tasks.

  18. Sedentary behaviour and physical inactivity assessment in primary care: the Rapid Assessment Disuse Index (RADI) study.

    PubMed

    Shuval, Kerem; Kohl, Harold W; Bernstein, Ira; Cheng, Dunlei; Pettee Gabriel, Kelley; Barlow, Carolyn E; Yinghui, Liu; DiPietro, Loretta

    2014-02-01

    The emerging evidence of the effects of sedentary time on health outcomes suggests a need to better measure this exposure. Healthcare settings, however, are not equipped with a tool that can quickly assess the sedentary habits of their patient population. The purpose of this study was to validate a tool for rapidly quantifying and tracking sedentary time and low levels of daily lifestyle physical activity among primary care patients. The study examined the test-retest reliability and validity of the Rapid Assessment Disuse Index (RADI) among adult patients from a large primary care clinic. Patients completed the RADI (comprising 3 items: sitting, moving, and stair climbing) twice, followed by accelerometer monitoring. Test-retest reliability was computed, and the correlation between survey responses and accelerometry was determined. A receiver operating characteristic curve was constructed and the area under the curve (AUC) was calculated. The RADI was temporally stable (intraclass correlation coefficient = 0.79), and a higher score was significantly correlated with greater sedentary time (ρ=0.40; p<0.01), fewer sedentary-to-active transitions (ρ=-0.42; p<0.01), and less light-intensity physical activity (ρ=-0.40; p<0.01). The ability of the RADI to detect patients with high levels of sedentary time was fair (AUC=0.72). This brief assessment tool, designed to quickly identify patients with high levels of sitting and low daily physical activity, exhibits good reliability and moderate validity. The RADI can assist in providing point-of-care recommendations pertaining to modifying sedentary behaviour.

  19. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
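
    MMTAT's internal models are not given in the abstract; a telecommunication link analysis of the kind it performs typically reduces to a decibel link budget, sketched here in Python with made-up numbers rather than MMTAT's parameter set.

        import math

        def free_space_path_loss_db(distance_m, freq_hz):
            """Free-space path loss, 20 * log10(4 * pi * d / lambda)."""
            wavelength = 3.0e8 / freq_hz
            return 20 * math.log10(4 * math.pi * distance_m / wavelength)

        # Illustrative X-band downlink: 8.4 GHz over 2e8 km.
        eirp_dbw = 20.0        # spacecraft EIRP
        rx_gain_dbi = 68.0     # ground station antenna gain
        loss_db = free_space_path_loss_db(2.0e11, 8.4e9)
        print("path loss %.1f dB, received %.1f dBW"
              % (loss_db, eirp_dbw + rx_gain_dbi - loss_db))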

  20. Shuttle Debris Impact Tool Assessment Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, R.; Rayos, E. M.; Campbell, C. H.; Rickman, S. L.

    2006-01-01

    Computational tools have been developed to estimate thermal and mechanical reentry loads experienced by the Space Shuttle Orbiter as the result of cavities in the Thermal Protection System (TPS). Such cavities can be caused by impact from ice or insulating foam debris shed from the External Tank (ET) on liftoff. The reentry loads depend on cavity geometry and certain Shuttle state variables, among other factors. Certain simplifying assumptions have been made in the tool development about the cavity geometry variables. For example, the cavities are all modeled as "shoeboxes," with rectangular cross-sections and planar walls, so an actual cavity is typically approximated with an idealized cavity described in terms of its length, width, and depth, as well as its entry angle, exit angle, and side angles (assumed to be the same for both sides). As part of a comprehensive assessment of the uncertainty in reentry loads estimated by the debris impact assessment tools, an effort has been initiated to quantify the component of the uncertainty that is due to imperfect geometry specifications for the debris impact cavities. The approach is to compute predicted loads for a set of geometry factor combinations sufficient to develop polynomial approximations to the complex, nonparametric underlying computational models. Such polynomial models are continuous and feature estimable, continuous derivatives, conditions that facilitate the propagation of independent variable errors. As an additional benefit, once the polynomial models have been developed, they require fewer computational resources to execute than the underlying finite element and computational fluid dynamics codes, and can generate reentry loads estimates in significantly less time. This provides a practical screening capability, in which a large number of debris impact cavities can be quickly classified either as harmless or as subject to additional analysis with the more comprehensive underlying computational tools. The polynomial models also provide useful insights into the sensitivity of reentry loads to various cavity geometry variables, and reveal complex interactions among those variables that indicate how the sensitivity of one variable depends on the level of one or more other variables. For example, the effect of cavity length on certain reentry loads depends on the depth of the cavity. Such interactions are clearly displayed in the polynomial response models.
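
    The surrogate step is an ordinary least-squares polynomial fit to a designed set of runs of the expensive codes. A compact sketch with two made-up geometry factors (cavity length L and depth D) standing in for the full factor space; the run data are invented for illustration.

        import numpy as np

        # Designed runs: (length, depth) pairs and the load the expensive
        # code returned at each point (invented numbers).
        X = np.array([[1., 1.], [1., 3.], [3., 1.], [3., 3.],
                      [2., 2.], [2., 1.], [1., 2.]])
        y = np.array([10., 22., 14., 35., 19., 12., 15.])

        def features(X):
            """Full quadratic model 1, L, D, L*D, L^2, D^2; the L*D term
            captures 'the effect of length depends on depth'."""
            L, D = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(L), L, D, L * D, L**2, D**2])

        coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
        # The cheap polynomial can now screen many cavities instantly:
        print(features(np.array([[2.5, 1.5]])) @ coef)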

  1. Climate Action Planning Tool | NREL

    Science.gov Websites

    NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus. Use the tool to

  2. Thermomechanical conditions and stresses on the friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Atthipalli, Gowtam

    Friction stir welding (FSW) has been used commercially as a joining process for aluminum and other soft materials. However, the use of this process in joining hard alloys is still developing, primarily because of the lack of cost-effective, long-lasting tools. Here I have developed numerical models to understand the thermomechanical conditions experienced by the FSW tool and to improve its reusability. A heat transfer and visco-plastic flow model is used to calculate the torque and traverse force on the tool during FSW. The computed values of torque and traverse force are validated using experimental results for FSW of AA7075, AA2524, AA6061 and Ti-6Al-4V alloys. The computed torque components are used to determine the optimum tool shoulder diameter based on the maximum use of torque and maximum grip of the tool on the plasticized workpiece material. The estimation of the optimum tool shoulder diameter for FSW of AA6061 and AA7075 was verified with experimental results. The computed values of traverse force and torque are used to calculate the maximum shear stress on the tool pin to determine the load-bearing ability of the pin. The load-bearing ability calculations are used to explain the failure of an H13 steel tool during welding of AA7075 and of a commercially pure tungsten tool during welding of L80 steel. Artificial neural network (ANN) models are developed to predict the important FSW output parameters as functions of selected input parameters. These ANNs consider tool shoulder radius, pin radius, pin length, welding velocity, tool rotational speed and axial pressure as input parameters. The total torque, sliding torque, sticking torque, peak temperature, traverse force, maximum shear stress and bending stress are considered as outputs of the ANN models. These output parameters are selected since they define the thermomechanical conditions around the tool during FSW. The developed ANN models are used to understand the effect of various input parameters on the total torque and traverse force during FSW of AA7075 and 1018 mild steel. The ANN models are also used to determine the tool safety factor for a wide range of input parameters. A numerical model is developed to calculate the strain and strain rates along streamlines during FSW; the strain and strain rate values are calculated for FSW of AA2524. Three simplified models are also developed for quick estimation of output parameters such as the material velocity field, torque and peak temperature. The material velocity fields are computed by adopting an analytical method of calculating velocities for the flow of an incompressible fluid between two discs, where one is rotating and the other is stationary. The peak temperature is estimated from a non-dimensional correlation with a dimensionless heat input, which is computed using known welding parameters and material properties. The torque is computed using an analytical function based on the shear strength of the workpiece material. These simplified models are shown to predict the output parameters successfully.
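
    The pin load-bearing check combines torsional shear from the computed torque with bending from the traverse force; the standard solid-shaft formulas behind such a check are sketched below with illustrative numbers, not the thesis values.

        import math

        def pin_stresses(torque, force, d, L):
            """Torsional shear tau = 16 T / (pi d^3) and root bending stress
            sigma = 32 F L / (pi d^3) for a solid cylindrical pin."""
            tau = 16 * torque / (math.pi * d**3)
            sigma = 32 * force * L / (math.pi * d**3)
            return tau, sigma

        # Illustrative: 10 N*m torque on the pin, 2 kN traverse force with a
        # 6 mm moment arm, 6 mm pin diameter.
        tau, sigma = pin_stresses(torque=10.0, force=2000.0, d=0.006, L=0.006)
        print("shear %.0f MPa, bending %.0f MPa" % (tau / 1e6, sigma / 1e6))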

  3. PChopper: high throughput peptide prediction for MRM/SRM transition design.

    PubMed

    Afzal, Vackar; Huang, Jeffrey T-J; Atrih, Abdel; Crowther, Daniel J

    2011-08-15

    The use of selective reaction monitoring (SRM)-based LC-MS/MS analysis for the quantification of phosphorylation stoichiometry has been increasing rapidly. At the same time, the number of sites that can be monitored in a single LC-MS/MS experiment is also increasing. The manual processes associated with running these experiments have highlighted the need for computational assistance to quickly design MRM/SRM candidates. PChopper has been developed to predict peptides that can be produced via enzymatic protein digests, including single-enzyme digests and combinations of enzymes. It also allows digests to be simulated in 'batch' mode and can combine information from these simulated digests to suggest the most appropriate enzyme(s) to use. PChopper also allows users to define the characteristics of their target peptides, and can automatically identify phosphorylation sites that may be of interest. Two application endpoints are available for interacting with the system; the first is a web-based graphical tool, and the second is an API endpoint based on HTTP REST. A service-oriented architecture was used to rapidly develop a system that can consume and expose several services. A graphical tool was built to provide an easy-to-follow workflow that allows scientists to quickly and easily identify the enzymes required to produce multiple peptides in parallel via enzymatic digests in a high-throughput manner.
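
    The core of any digest-prediction tool like PChopper is a set of enzyme cleavage rules applied in silico. The sketch below implements only the classic trypsin rule (cleave after K or R, but not before P) with optional missed cleavages; it illustrates the concept and is not PChopper's actual engine, and the test sequence is arbitrary.

```python
# A minimal sketch of simulating an enzymatic digest. Only the trypsin
# rule is implemented; PChopper supports many enzymes and combinations.
import re

def digest_trypsin(sequence, missed_cleavages=0):
    """Return peptides from an in-silico tryptic digest."""
    # Split after K/R when not followed by P (zero-width split points).
    fragments = re.split(r'(?<=[KR])(?!P)', sequence)
    peptides = set()
    for i in range(len(fragments)):
        # Join up to `missed_cleavages` adjacent fragments.
        for j in range(i, min(i + missed_cleavages + 1, len(fragments))):
            peptides.add(''.join(fragments[i:j + 1]))
    return sorted(peptides)

print(digest_trypsin("MKWVTFISLLLLFSSAYSRGVFRRDTHK", missed_cleavages=1))
```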

  4. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
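
    The platform described above follows a submit-and-poll automation pattern between the scanner-side client and the remote HPC resource. The sketch below shows that pattern generically; the base URL, endpoint paths, JSON fields, and app name are hypothetical placeholders, not the actual Agave API.

```python
# A minimal sketch of submitting an analysis job to a remote HPC gateway
# over HTTP and polling for completion. All endpoints and fields here are
# hypothetical placeholders, not the real Agave interface.
import time
import requests

BASE = "https://gateway.example.org/v2"       # hypothetical gateway URL
TOKEN = {"Authorization": "Bearer <token>"}   # placeholder credential

def submit_and_wait(job_spec, poll_s=30):
    r = requests.post(f"{BASE}/jobs", json=job_spec, headers=TOKEN)
    r.raise_for_status()
    job_id = r.json()["id"]
    while True:
        status = requests.get(f"{BASE}/jobs/{job_id}",
                              headers=TOKEN).json()["status"]
        if status in ("FINISHED", "FAILED"):
            return status
        time.sleep(poll_s)  # production systems would prefer webhooks

job = {"app": "grape-mri-analysis",           # hypothetical app name
       "inputs": {"scan": "series_001.nii.gz"}}
# submit_and_wait(job)  # requires a live gateway and a valid token
```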

  5. Science preparedness and science response: perspectives on the dynamics of preparedness conference.

    PubMed

    Lant, Timothy; Lurie, Nicole

    2013-01-01

    The ability of the scientific modeling community to meaningfully contribute to postevent response activities during public health emergencies was the direct result of a discrete set of preparedness activities as well as advances in theory and technology. Scientists and decision-makers have recognized the value of developing scientific tools (e.g. models, data sets, communities of practice) to prepare them to be able to respond quickly--in a manner similar to preparedness activities by first-responders and emergency managers. Computational models have matured in their ability to better inform response plans by modeling human behaviors and complex systems. We advocate for further development of science preparedness activities as deliberate actions taken in advance of an unpredicted event (or an event with unknown consequences) to increase the scientific tools and evidence-base available to decision makers and the whole-of-community to limit adverse outcomes.

  6. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

    The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all the Web's possibilities nor provides for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly load even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  7. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond.

    PubMed

    Bible, Paul W; Kanno, Yuka; Wei, Lai; Brooks, Stephen R; O'Shea, John J; Morasso, Maria I; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST's functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST's general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work.
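
    At the heart of peak-centric co-localization is the comparison of two sets of genomic coordinate intervals. The sketch below shows one simple way to enumerate overlapping pairs, assuming each input list is sorted by start and internally non-overlapping (typical for called peak sets); PAPST layers gene-centric analysis and profile search on top of comparisons like this.

```python
# A minimal sketch of the interval-overlap operation behind peak
# co-localization. Intervals are half-open (start, end) tuples.
import bisect

def overlapping_pairs(a, b):
    """Find overlapping (i, j) index pairs between interval lists a and b.
    Assumes b is sorted by start with no overlaps inside b, so interval
    ends in b are also increasing and the leftward scan can stop early."""
    starts = [s for s, _ in b]
    pairs = []
    for i, (s, e) in enumerate(a):
        j = bisect.bisect_left(starts, e)   # intervals b[j:] start at/after e
        k = j - 1
        while k >= 0 and b[k][1] > s:       # b[k] still overlaps a[i]
            pairs.append((i, k))
            k -= 1
    return pairs

peaks_tf = [(100, 200), (500, 650)]         # e.g. TF binding peaks
peaks_mark = [(150, 220), (300, 400), (600, 700)]   # e.g. histone mark peaks
print(overlapping_pairs(peaks_tf, peaks_mark))      # [(0, 0), (1, 2)]
```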

  8. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    PubMed Central

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601

  9. Temperature and Material Flow Prediction in Friction-Stir Spot Welding of Advanced High-Strength Steel

    NASA Astrophysics Data System (ADS)

    Miles, M.; Karki, U.; Hovanski, Y.

    2014-10-01

    Friction-stir spot welding (FSSW) has been shown to be capable of joining advanced high-strength steel, with its flexibility in controlling the heat of welding and the resulting microstructure of the joint. This makes FSSW a potential alternative to resistance spot welding if tool life is sufficiently high, and if machine spindle loads are sufficiently low that the process can be implemented on an industrial robot. Robots for spot welding can typically sustain vertical loads of about 8 kN, but FSSW at tool speeds of less than 3000 rpm causes loads that are too high, in the range of 11-14 kN. Therefore, in the current work, tool speeds of 5000 rpm were employed to generate heat more quickly and to reduce welding loads to acceptable levels. Si3N4 tools were used for the welding experiments on 1.2-mm DP 980 steel. The FSSW process was modeled with a finite element approach using the Forge® software. An updated Lagrangian scheme with explicit time integration was employed to predict the flow of the sheet material, subjected to boundary conditions of a rotating tool and a fixed backing plate. Material flow was calculated from a velocity field that is two-dimensional, but heat generated by friction was computed by a novel approach, where the rotational velocity component imparted to the sheet by the tool surface was included in the thermal boundary conditions. An isotropic, viscoplastic Norton-Hoff law was used to compute the material flow stress as a function of strain, strain rate, and temperature. The model predicted welding temperatures to within 4%, and the position of the joint interface to within 10%, of the experimental results.
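
    The abstract names an isotropic, viscoplastic Norton-Hoff law for the flow stress. The sketch below shows a Norton-Hoff-type power law of that general form; the constants K0, beta, and m are hypothetical placeholders, not the DP 980 parameters used in the study.

```python
# A minimal sketch of a Norton-Hoff-type viscoplastic flow stress: stress
# depends on strain rate and temperature through a power law. Constants
# below are illustrative placeholders only.
import math

def flow_stress(strain_rate, temp_k, k0=5.0e6, beta=3000.0, m=0.15):
    """sigma = K0 * exp(beta / T) * (sqrt(3) * strain_rate)**m  [Pa]"""
    consistency = k0 * math.exp(beta / temp_k)   # softens with temperature
    return consistency * (math.sqrt(3.0) * strain_rate) ** m

# Flow stress softens with temperature and stiffens with strain rate:
for t in (900.0, 1100.0, 1300.0):
    print(t, round(flow_stress(strain_rate=50.0, temp_k=t) / 1e6, 1), "MPa")
```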

  10. Temperature and Material Flow Prediction in Friction-Stir Spot Welding of Advanced High-Strength Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles, Michael; Karki, U.; Hovanski, Yuri

    Friction-stir spot welding (FSSW) has been shown to be capable of joining advanced high-strength steel, with its flexibility in controlling the heat of welding and the resulting microstructure of the joint. This makes FSSW a potential alternative to resistance spot welding if tool life is sufficiently high, and if machine spindle loads are sufficiently low that the process can be implemented on an industrial robot. Robots for spot welding can typically sustain vertical loads of about 8 kN, but FSSW at tool speeds of less than 3000 rpm causes loads that are too high, in the range of 11–14 kN. Therefore, in the current work, tool speeds of 5000 rpm were employed to generate heat more quickly and to reduce welding loads to acceptable levels. Si3N4 tools were used for the welding experiments on 1.2-mm DP 980 steel. The FSSW process was modeled with a finite element approach using the Forge® software. An updated Lagrangian scheme with explicit time integration was employed to predict the flow of the sheet material, subjected to boundary conditions of a rotating tool and a fixed backing plate. Material flow was calculated from a velocity field that is two-dimensional, but heat generated by friction was computed by a novel approach, where the rotational velocity component imparted to the sheet by the tool surface was included in the thermal boundary conditions. An isotropic, viscoplastic Norton-Hoff law was used to compute the material flow stress as a function of strain, strain rate, and temperature. The model predicted welding temperatures to within 4 percent, and the position of the joint interface to within 10 percent, of the experimental results.

  11. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundational infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  12. Chemical-Help Application for Classification and Identification of Stormwater Constituents

    USGS Publications Warehouse

    Granato, Gregory E.; Driskell, Timothy R.; Nunes, Catherine

    2000-01-01

    A computer application called Chemical Help was developed to facilitate review of reports for the National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS). The application provides a tool to quickly find a proper classification for any constituent in the NDAMS review sheets. Chemical Help contents include the name of each water-quality property, constituent, or parameter, the section number within the NDAMS review sheet, the organizational levels within a classification hierarchy, the database number, and where appropriate, the chemical formula, the Chemical Abstract Service number, and a list of synonyms (for the organic chemicals). Therefore, Chemical Help provides information necessary to research available reference data for the water-quality properties and constituents of potential interest in stormwater studies. Chemical Help is implemented in the Microsoft help-system interface. (Computer files for the use and documentation of Chemical Help are included on an accompanying diskette.)

  13. Representing spatial information in a computational model for network management

    NASA Technical Reports Server (NTRS)

    Blaisdell, James H.; Brownfield, Thomas F.

    1994-01-01

    While currently available relational database management systems (RDBMS) allow inclusion of spatial information in a data model, they lack tools for presenting this information in an easily comprehensible form. Computer-aided design (CAD) software packages provide adequate functions to produce drawings, but still require manual placement of symbols and features. This project has demonstrated a bridge between the data model of an RDBMS and the graphic display of a CAD system. It is shown that the CAD system can be used to control the selection of data with spatial components from the database and then quickly plot that data on a map display. It is shown that the CAD system can be used to extract data from a drawing and then control the insertion of that data into the database. These demonstrations were successful in a test environment that incorporated many features of known working environments, suggesting that the techniques developed could be adapted for practical use.

  14. Jflow: a workflow management system for web applications.

    PubMed

    Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe

    2016-02-01

    Biologists produce large data sets and demand rich and simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity induced by the needed High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS) composed of jQuery plug-ins, which can easily be embedded in any web application, and a Python library providing all the features required to set up, run, and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick start, and a running test portal. Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. OpenKIM - Building a Knowledgebase of Interatomic Models

    NASA Astrophysics Data System (ADS)

    Bierbaum, Matthew; Tadmor, Ellad; Elliott, Ryan; Wennblom, Trevor; Alemi, Alexander; Chen, Yan-Jiun; Karls, Daniel; Ludvik, Adam; Sethna, James

    2014-03-01

    The Knowledgebase of Interatomic Models (KIM) is an effort by the computational materials community to provide a standard interface for the development, characterization, and use of interatomic potentials. The KIM project has developed an API between simulation codes and interatomic models written in several different languages, including C, Fortran, and Python. This interface is already supported in popular simulation environments such as LAMMPS and ASE, giving quick access to over a hundred compatible potentials that have been contributed so far. To compare and characterize models, we have developed a computational processing pipeline which automatically runs a series of tests for each model in the system, such as phonon dispersion relations and elastic constant calculations. To view the data from these tests, we created a rich set of interactive visualization tools located online. Finally, we created a Web repository to store and share these potentials, tests, and visualizations, which can be found at https://openkim.org along with further information.

  16. Design of a recovery system for a reentry vehicle

    NASA Technical Reports Server (NTRS)

    Von Eckroth, Wulf; Garrard, William L.; Miller, Norman

    1993-01-01

    Engineers are often required to design decelerator systems which are deployed in cross-wind orientations. If the system is not designed to minimize 'line sail', damage to the parachutes could result. A Reentry Vehicle Analysis Code (RVAC) and an accompanying graphics animation software program (DISPLAY) are presented in this paper. These computer codes allow the user to quickly apply the Purvis line sail modeling technique to any vehicle and then observe the relative motion of the vehicle, nose cap, suspension lines, pilot and drogue bags and canopies on a computer screen. Data files are created which allow plots of velocities, spacial positions, and dynamic pressures versus time to be generated. The code is an important tool for the design engineer because it integrates two degrees of freedom (DOF) line sail equations with a three DOF model of the reentry body and jettisoned nose cap to provide an animated output.

  17. An efficient algorithm for computing fixed length attractors based on bounded model checking in synchronous Boolean networks with biochemical applications.

    PubMed

    Li, X Y; Yang, G W; Zheng, D S; Guo, W S; Hung, W N N

    2015-04-28

    Genetic regulatory networks are the key to understanding biochemical systems. The state of a genetic regulatory network under different living environments can be modeled as a synchronous Boolean network. The attractors of these Boolean networks help biologists to identify determinant and stable factors. Existing methods identify attractors starting from a random initial state or by considering the entire state space simultaneously; they cannot identify fixed-length attractors directly, and their complexity grows exponentially with the number and length of the attractors. This study used bounded model checking to quickly locate fixed-length attractors. Based on a SAT solver, we propose a new algorithm for efficiently computing fixed-length attractors, which is more suitable for large Boolean networks and networks with numerous attractors. Empirical experiments involving biochemical systems, compared against the tool BooleNet, demonstrated the feasibility and efficiency of our approach.
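
    For intuition about what an attractor search computes, the sketch below finds all attractors of a toy synchronous Boolean network by brute-force enumeration of its state space. The paper's contribution is doing this with bounded model checking and a SAT solver, which scales far beyond this exhaustive approach; the 3-node update rule here is invented.

```python
# A minimal brute-force illustration of attractor detection in a
# synchronous Boolean network. Not the paper's SAT-based method.
def step(state):
    """Synchronous update of a 3-node toy network (tuple of 0/1)."""
    a, b, c = state
    return (b & c, a | c, 1 - a)

def attractors(n_nodes=3):
    found = set()
    for s in range(2 ** n_nodes):
        state = tuple((s >> i) & 1 for i in range(n_nodes))
        seen = {}
        while state not in seen:          # walk until a state repeats
            seen[state] = len(seen)
            state = step(state)
        # States from the first repeat onward form the attractor cycle.
        cycle_start = seen[state]
        cycle = tuple(sorted(k for k, v in seen.items() if v >= cycle_start))
        found.add(cycle)                  # dedups cycles reached repeatedly
    return found

for cyc in attractors():
    print(len(cyc), cyc)                  # cycle length and its states
```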

  18. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    PubMed

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-08-14

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking, with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple, quick, yet effective computational tool aimed at reducing artifacts from head movements, as well as a method to detect blinking signals for mouse control. A Kalman filter is used as the state estimator for mouse position control and jitter removal. The detection rate obtained on average was 94.9%. The experimental setup and some of the results obtained are presented.
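
    The state-estimation step mentioned above is a standard linear Kalman filter. The sketch below smooths noisy 2D cursor positions with a constant-velocity model; the update rate and the noise covariances Q and R are illustrative values, not parameters tuned to the Emotiv data.

```python
# A minimal sketch of a constant-velocity Kalman filter for smoothing
# noisy 2D cursor positions. Covariances are illustrative, not tuned.
import numpy as np

dt = 1.0 / 60.0                               # assumed update rate (s)
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]])    # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])    # we observe position only
Q = np.eye(4) * 1e-4                          # process noise (assumed)
R = np.eye(2) * 4.0                           # measurement noise (assumed)

x = np.zeros(4)
P = np.eye(4)

def kalman_step(z):
    """One predict/update cycle for measurement z = [x_meas, y_meas]."""
    global x, P
    x = F @ x                                  # predict state
    P = F @ P @ F.T + Q                        # predict covariance
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ (np.asarray(z) - H @ x)        # update with measurement
    P = (np.eye(4) - K @ H) @ P
    return x[:2]                               # smoothed cursor position

print(kalman_step([120.0, 80.0]))
```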

  19. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    PubMed Central

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-01-01

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking, with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple, quick, yet effective computational tool aimed at reducing artifacts from head movements, as well as a method to detect blinking signals for mouse control. A Kalman filter is used as the state estimator for mouse position control and jitter removal. The detection rate obtained on average was 94.9%. The experimental setup and some of the results obtained are presented. PMID:23948873

  20. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaboration, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  1. Experience Report: A Do-It-Yourself High-Assurance Compiler

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn

    2012-01-01

    Embedded domain-specific languages (EDSLs) are an approach for quickly building new languages while maintaining the advantages of a rich metalanguage. We argue in this experience report that the "EDSL approach" can surprisingly ease the task of building a high-assurance compiler. We do not strive to build a fully formally-verified tool-chain, but take a "do-it-yourself" approach to increase our confidence in compiler-correctness without too much effort. Copilot is an EDSL developed by Galois, Inc. and the National Institute of Aerospace under contract to NASA for the purpose of runtime monitoring of flight-critical avionics. We report our experience in using type-checking, QuickCheck, and model-checking "off-the-shelf" to quickly increase confidence in our EDSL tool-chain.

  2. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  3. CFD Extraction Tool for TecPlot From DPLR Solutions

    NASA Technical Reports Server (NTRS)

    Norman, David

    2013-01-01

    This invention is a macro, written in the TecPlot programming language, that processes data from DPLR solutions in TecPlot format. DPLR (Data-Parallel Line Relaxation) is a NASA computational fluid dynamics (CFD) code, and TecPlot is a commercial CFD post-processing tool. The TecPlot data is in SI units (the same as the DPLR output); the macro converts the SI units into British units. The macro modifies the TecPlot data with unit conversions and adds some extra calculations. After unit conversion, the macro cuts a slice and adds vectors to the current plot for output. The macro can also process surface solutions. Existing solutions use manual conversion and superposition. The conversion is complicated because it must be applied to a range of inter-related scalars and vectors to describe a 2D or 3D flow field. The macro processes the CFD solution to create superpositions/comparisons of scalars and vectors. The existing manual solution is cumbersome, open to errors, slow, and cannot be inserted into an automated process. This invention is quick and easy to use, and can be inserted into an automated data-processing algorithm.

  4. Sine-Bar Attachment For Machine Tools

    NASA Technical Reports Server (NTRS)

    Mann, Franklin D.

    1988-01-01

    Sine-bar attachment for collets, spindles, and chucks helps machinists set up quickly for precise angular cuts that require greater precision than provided by the graduations of machine tools. Machinist uses attachment to index head or carriage of milling machine or lathe relative to table or turning axis of tool. Attachment accurate to 1 minute of arc, depending on length of sine bar and precision of gauge blocks in setup. Attachment installs quickly and easily on almost any type of lathe or mill. Requires no special clamps or fixtures, and eliminates many trial-and-error measurements. More stable than improvised setups and not jarred out of position readily.
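
    The sine-bar relation underlying the attachment is simple enough to state exactly: a bar of length L raised on a gauge-block stack of height h tilts by the angle theta satisfying h = L * sin(theta). A minimal worked example:

```python
# Compute the gauge-block stack height needed to set a sine bar to a
# desired angle: h = L * sin(theta).
import math

def stack_height(bar_length_in, angle_deg):
    return bar_length_in * math.sin(math.radians(angle_deg))

# A 5-inch sine bar set to 30 degrees needs a 2.5000-inch block stack:
print(round(stack_height(5.0, 30.0), 4))   # 2.5
```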

  5. Interactive Software For Astrodynamical Calculations

    NASA Technical Reports Server (NTRS)

    Schlaifer, Ronald S.; Skinner, David L.; Roberts, Phillip H.

    1995-01-01

    The QUICK computer program provides the user with the facilities of a sophisticated desk calculator that performs scalar, vector, and matrix arithmetic; propagates conic-section orbits; determines planetary and satellite coordinates; and performs other related astrodynamic calculations within a FORTRAN-like software environment. QUICK is an interpreter, so there is no need to use a compiler or linker to run QUICK code. Outputs can be plotted in a variety of formats on a variety of terminals. Written in RATFOR.

  6. SU-F-J-192: A Quick and Effective Method to Validate Patient’s Daily Setup and Geometry Changes Prior to Proton Treatment Delivery Based On Water Equivalent Thickness Projection Imaging (WETPI) for Head Neck Cancer (HNC) Patient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, G; Qin, A; Zhang, J

    Purpose: With the implementation of cone-beam computed tomography (CBCT) in proton treatment, we introduce a quick and effective tool to verify the patient's daily setup and geometry changes based on the water equivalent thickness projection image (WETPI) from individual beam angles. Methods: A bilateral head and neck cancer (HNC) patient previously treated via VMAT was used in this study. The patient received 35 daily CBCTs during the whole treatment, with no significant weight change. The CT numbers of the daily CBCTs were corrected by mapping the CT numbers from the simulation CT via deformable image registration (DIR). An IMPT plan was generated using 4-field IMPT robust optimization (3.5% range and 3mm setup uncertainties) with beam angles of 60, 135, 300, and 225 degrees. WETPI within the CTV through all beam directions was calculated. A 3%/3mm gamma index (GI) was used to provide a quantitative comparison between the initial sim-CT and the mapped daily CBCT. To simulate an extreme case involving human error, a couch bar was manually inserted in front of beam angle 225 degrees in one CBCT, and the WETPI was compared in this scenario. Results: The average GI passing rate for this patient across beam angles throughout the treatment course was 91.5 ± 8.6. In the cases with low passing rates, it was found that differences in shoulder and neck angle, as well as the head rest, often caused the major deviations. This indicates that the greatest setup challenge in treating HNC is around the neck area. In the extreme case where a couch bar is accidentally inserted in the beam line, the GI passing rate drops to 52 from 95. Conclusion: WETPI with quantitative gamma analysis gives clinicians, therapists, and physicists quick feedback on the patient's setup accuracy or geometry changes. The tool could effectively avoid some human errors. Furthermore, this tool could potentially be used as an initial signal to trigger plan adaptation.
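
    The 3%/3mm gamma index used above combines a dose-difference tolerance with a distance-to-agreement tolerance; a point passes when the best combined score over the reference is at most 1. The sketch below computes a 1D, globally normalized gamma on synthetic profiles to illustrate the metric; it is not the clinical implementation used in the study.

```python
# A minimal 1D sketch of a 3%/3mm gamma comparison. Each evaluated point
# searches the reference profile for the best combined dose-difference /
# distance-to-agreement score; gamma <= 1 passes.
import numpy as np

def gamma_1d(ref, eval_, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    x = np.arange(len(ref)) * spacing_mm
    norm = dose_tol * ref.max()                 # global dose normalization
    gammas = np.empty(len(eval_))
    for i in range(len(eval_)):
        dd = (eval_[i] - ref) / norm            # dose-difference term
        dx = (x[i] - x) / dist_mm               # distance term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

ref = np.array([1.0, 2.0, 5.0, 9.0, 10.0, 9.0, 5.0, 2.0, 1.0])
eva = ref * 1.02                                # 2% hotter everywhere
g = gamma_1d(ref, eva, spacing_mm=1.0)
print(round((g <= 1.0).mean() * 100, 1), "% passing")
```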

  7. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis, and compute environments that are "archivable", transferable, and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  8. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.

  9. Device Rotates Bearing Balls For Inspection

    NASA Technical Reports Server (NTRS)

    Burley, R. K.

    1988-01-01

    Entire surface of ball inspected automatically and quickly. Device holds and rotates bearing ball for inspection by optical or mechanical surface-quality probe, eddy-current probe for detection of surface or subsurface defects, or circumference-measuring tool. Ensures entire surface of ball moves past inspection head quickly. New device saves time and increases reliability of inspections of spherical surfaces. Simple to operate and provides quick and easy access for loading and unloading of balls during inspection.

  10. Multi-modal imaging, model-based tracking, and mixed reality visualisation for orthopaedic surgery

    PubMed Central

    Fuerst, Bernhard; Tateno, Keisuke; Johnson, Alex; Fotouhi, Javad; Osgood, Greg; Tombari, Federico; Navab, Nassir

    2017-01-01

    Orthopaedic surgeons are still following the decades old workflow of using dozens of two-dimensional fluoroscopic images to drill through complex 3D structures, e.g. pelvis. This Letter presents a mixed reality support system, which incorporates multi-modal data fusion and model-based surgical tool tracking for creating a mixed reality environment supporting screw placement in orthopaedic surgery. A red–green–blue–depth camera is rigidly attached to a mobile C-arm and is calibrated to the cone-beam computed tomography (CBCT) imaging space via iterative closest point algorithm. This allows real-time automatic fusion of reconstructed surface and/or 3D point clouds and synthetic fluoroscopic images obtained through CBCT imaging. An adapted 3D model-based tracking algorithm with automatic tool segmentation allows for tracking of the surgical tools occluded by hand. This proposed interactive 3D mixed reality environment provides an intuitive understanding of the surgical site and supports surgeons in quickly localising the entry point and orienting the surgical tool during screw placement. The authors validate the augmentation by measuring target registration error and also evaluate the tracking accuracy in the presence of partial occlusion. PMID:29184659

  11. Special tool kit aids heavily garmented workers

    NASA Technical Reports Server (NTRS)

    Holmes, A. E.

    1966-01-01

    Triangular aluminum tool kit, filled with polyurethane, is constructed to receive various tools and hold them in a snug but quick-release fit as an aid to heavily gloved workers. The kit is designed to allow mounting within easily accessible reach and to provide protection for the tools during storage.

  12. Systems Prototyping with Fourth Generation Tools.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that uses both traditional programing techniques and fourth generation software tools is described. Fourth generation applications tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  13. Vehicle assisted harpoon breaching tool

    DOEpatents

    Pacheco, James E [Albuquerque, NM; Highland, Steven E [Albuquerque, NM

    2011-02-15

    A harpoon breaching tool that allows security officers, SWAT teams, police, firemen, soldiers, or others to forcibly breach metal doors or walls very quickly (in a few seconds), without explosives. The harpoon breaching tool can be mounted to a vehicle's standard receiver hitch.

  14. Phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography

    PubMed Central

    Ludlow, John B.; Walker, Cameron

    2013-01-01

    Introduction: Increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response to this has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Methods: Effective doses resulting from various combinations of field size and field location, comparing child and adult anthropomorphic phantoms using the recently introduced i-CAT FLX cone-beam computed tomography unit, were measured with optically stimulated dosimetry using previously validated protocols. Scan protocols included High Resolution (360° rotation, 600 image frames, 120 kVp, 5 mA, 7.4 sec), Standard (360°, 300 frames, 120 kVp, 5 mA, 3.7 sec), QuickScan (180°, 160 frames, 120 kVp, 5 mA, 2 sec), and QuickScan+ (180°, 160 frames, 90 kVp, 3 mA, 2 sec). Contrast-to-noise ratio (CNR) was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Results: Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than Standard protocols for the child (p=0.0167) and adult (p=0.0055) phantoms. Doses for 13×16 cm cephalometric fields of view ranged from 11 to 85 μSv in the adult phantom and 18 to 120 μSv in the child phantom for QuickScan+ and Standard protocols, respectively. CNR was reduced by approximately two-thirds when comparing QuickScan+ with Standard exposure parameters. Conclusions: QuickScan+ effective doses are comparable to those of conventional panoramic examinations. The significant dose reductions are accompanied by significant reductions in image quality; however, this trade-off may be acceptable for certain diagnostic tasks, such as interim assessment of treatment results. PMID:24286904
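
    The contrast-to-noise ratio used above as the image-quality metric is straightforward to compute once regions of interest are defined: CNR = |mean(A) - mean(B)| / sd(noise). The sketch below evaluates it on synthetic voxel samples; the ROI means and noise level are invented, not values from the QUART DVT phantom.

```python
# A minimal sketch of a contrast-to-noise ratio computation on synthetic
# regions of interest. All values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
roi_a = rng.normal(100.0, 5.0, 500)        # e.g. high-density insert voxels
roi_b = rng.normal(60.0, 5.0, 500)         # e.g. low-density insert voxels
background = rng.normal(0.0, 5.0, 500)     # noise-only region

cnr = abs(roi_a.mean() - roi_b.mean()) / background.std()
print(round(cnr, 2))
```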

  15. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  16. Quick Information Sheets.

    ERIC Educational Resources Information Center

    Wisconsin Univ., Madison. Trace Center.

    This compilation of "Trace Quick Sheets" provides descriptions, prices, and ordering information for products and services that assist with communication, control, and computer access for disabled individuals. Product descriptions or product sources are included for: adaptive toys and toy modifications; head pointers, light pointers, and…

  17. CRADA Final Report: Weld Predictor App

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billings, Jay Jay

    Welding is an important manufacturing process used in a broad range of industries and market sectors, including automotive, aerospace, heavy manufacturing, medical, and defense. During welded fabrication, high localized heat input and subsequent rapid cooling result in the creation of residual stresses and distortion. These residual stresses can significantly affect the fatigue resistance, cracking behavior, and load-carrying capacity of welded structures during service. Further, additional fitting and tacking time is often required to fit distorted subassemblies together, resulting in non-value-added cost. Using trial-and-error methods to determine which welding parameters, welding sequences, and fixture designs will most effectively reduce distortion is a time-consuming and expensive process. For complex structures with many welds, this approach can take several months. For this reason, efficient and accurate methods of mitigating distortion are in demand across all industries where welding is used. Analytical and computational methods and commercial software tools have been developed to predict welding-induced residual stresses and distortion. Welding process parameters, fixtures, and tooling can be optimized to reduce HAZ softening and minimize weld residual stress and distortion, improving performance and reducing design, fabrication, and testing costs. However, weld modeling technology tools are currently accessible only to engineers and designers with a background in finite element analysis (FEA) who work with large manufacturers, research institutes, and universities with access to high-performance computing (HPC) resources. Small and medium enterprises (SMEs) in the US do not typically have the human and computational resources needed to adopt and utilize weld modeling technology. To give engineers with no background in FEA, and SMEs, access to this important design tool, EWI and the Ohio Supercomputer Center (OSC) developed the online weld application software tool "WeldPredictor" (https://eweldpredictor.ewi.org). About 1400 users have tested this application. This project marked the beginning of development on the next version of WeldPredictor, which addresses many outstanding limitations of the original: it adds 3D models, more material hardening laws, and material phase transformation modeling, and uses open-source finite element solvers (as opposed to expensive commercial tools) to solve problems quickly.

  18. QuickProbs—A Fast Multiple Sequence Alignment Algorithm Designed for Graphics Processors

    PubMed Central

    Gudyś, Adam; Deorowicz, Sebastian

    2014-01-01

    Multiple sequence alignment is a crucial task in a number of biological analyses like secondary structure prediction, domain searching, phylogeny, etc. MSAProbs is currently the most accurate alignment algorithm, but its effectiveness is obtained at the expense of computational time. In the paper we present QuickProbs, the variant of MSAProbs customised for graphics processors. We selected the two most time consuming stages of MSAProbs to be redesigned for GPU execution: the posterior matrices calculation and the consistency transformation. Experiments on three popular benchmarks (BAliBASE, PREFAB, OXBench-X) on quad-core PC equipped with high-end graphics card show QuickProbs to be 5.7 to 9.7 times faster than original CPU-parallel MSAProbs. Additional tests performed on several protein families from Pfam database give overall speed-up of 6.7. Compared to other algorithms like MAFFT, MUSCLE, or ClustalW, QuickProbs proved to be much more accurate at similar speed. Additionally we introduce a tuned variant of QuickProbs which is significantly more accurate on sets of distantly related sequences than MSAProbs without exceeding its computation time. The GPU part of QuickProbs was implemented in OpenCL, thus the package is suitable for graphics processors produced by all major vendors. PMID:24586435

  19. Conformational analysis by intersection: CONAN.

    PubMed

    Smellie, Andrew; Stanton, Robert; Henne, Randy; Teig, Steve

    2003-01-15

    As high throughput techniques in chemical synthesis and screening improve, more demands are placed on computer assisted design and virtual screening. Many of these computational methods require one or more three-dimensional conformations for molecules, creating a demand for a conformational analysis tool that can rapidly and robustly cover the low-energy conformational spaces of small molecules. A new algorithm of intersection is presented here, which quickly generates (on average <0.5 seconds/stereoisomer) a complete description of the low energy conformational space of a small molecule. The molecule is first decomposed into nonoverlapping nodes N (usually rings) and overlapping paths P with conformations (N and P) generated in an offline process. In a second step the node and path data are combined to form distinct conformers of the molecule. Finally, heuristics are applied after intersection to generate a small representative collection of conformations that span the conformational space. In a study of approximately 97,000 randomly selected molecules from the MDDR, results are presented that explore these conformations and their ability to cover low-energy conformational space. Copyright 2002 Wiley Periodicals, Inc. J Comput Chem 24: 10-20, 2003

  20. TopoDrive and ParticleFlow--Two Computer Models for Simulation and Visualization of Ground-Water Flow and Transport of Fluid Particles in Two Dimensions

    USGS Publications Warehouse

    Hsieh, Paul A.

    2001-01-01

    This report serves as a user's guide for two computer models: TopoDrive and ParticleFlow. These two-dimensional models are designed to simulate two ground-water processes: topography-driven flow and advective transport of fluid particles. To simulate topography-driven flow, the user may specify the shape of the water table, which bounds the top of the vertical flow section. To simulate transport of fluid particles, the model domain is a rectangle with overall flow from left to right. In both cases, the flow is under steady state, and the distribution of hydraulic conductivity may be specified by the user. The models compute hydraulic head, ground-water flow paths, and the movement of fluid particles. An interactive visual interface enables the user to easily and quickly explore model behavior, and thereby better understand ground-water flow processes. In this regard, TopoDrive and ParticleFlow are not intended to be comprehensive modeling tools, but are designed for modeling at the exploratory or conceptual level, for visual demonstration, and for educational purposes.
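
    Steady-state ground-water flow of the kind these models simulate reduces, for homogeneous conductivity, to Laplace's equation for hydraulic head. The sketch below solves it on a small grid by Jacobi iteration, with a sloping fixed-head water table on top and no-flow sides and bottom; the grid size and boundary values are illustrative, and TopoDrive's own numerics are not described in this abstract.

```python
# A minimal sketch of computing hydraulic head for steady topography-
# driven flow: Laplace's equation solved by Jacobi iteration.
import numpy as np

ny, nx = 30, 60
h = np.zeros((ny, nx))
h[0, :] = np.linspace(10.0, 0.0, nx)       # water table: high on the left

for _ in range(5000):
    h_new = h.copy()
    h_new[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1]
                                + h[1:-1, :-2] + h[1:-1, 2:])
    h_new[-1, :] = h_new[-2, :]            # no-flow bottom
    h_new[:, 0] = h_new[:, 1]              # no-flow left side
    h_new[:, -1] = h_new[:, -2]            # no-flow right side
    h_new[0, :] = h[0, :]                  # keep the fixed water table
    h = h_new

# Flow runs from high head to low head; heads decrease left to right:
print(h[15, ::10].round(2))
```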

  1. Expression and Purification of a Novel Computationally Designed Antigen for Simultaneously Detection of HTLV-1 and HBV Antibodies.

    PubMed

    Heydari Zarnagh, Hafez; Ravanshad, Mehrdad; Pourfatollah, Ali Akbar; Rasaee, Mohammad Javad

    2015-04-01

    Computational tools are reliable alternatives to laborious work in chimeric protein design. In this study, a chimeric antigen was designed using computational techniques for simultaneous detection of anti-HTLV-I and anti-HBV in infected sera. Databases were searched for amino acid sequences of HBV/HTLV-I diagnostic antigens. The immunodominant fragments were selected based on propensity scales, and the diagnostic antigen was designed using these fragments. Secondary and tertiary structures were predicted, and the B-cell epitopes were mapped on the surface of the built model. The synthetic DNA encoding the antigen was sub-cloned into the pGS21a expression vector. SDS-PAGE analysis showed that the glutathione-fused antigen was highly expressed in E. coli BL21 (DE3) cells. The recombinant antigen was purified by nickel affinity chromatography. ELISA results showed that the soluble antigen could specifically react with HTLV-I- and HBV-infected sera. This specific antigen could be used as a suitable agent for antibody-antigen based screening tests and can help clinicians perform quick and precise screening for HBV and HTLV-I infections.

  2. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-05-01

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of the tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides, and a thorough, in-depth understanding of how disulfide-linked peptides fragment in the mass spectrometer is key to developing software to interpret their tandem mass spectra. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.

  3. Quickly Approximating the Distance Between Two Objects

    NASA Technical Reports Server (NTRS)

    Hammen, David

    2009-01-01

    A method of quickly approximating the distance between two objects (one smaller, regarded as a point; the other larger and complexly shaped) has been devised for use in computationally simulating motions of the objects for the purpose of planning the motions to prevent collisions.
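
    The report does not detail the method here, but a common way to make point-to-object distance queries cheap is to precompute a bounding sphere and use it as a quick lower bound, falling back to an exact test only when the bound is not conclusive. The sketch below illustrates that general idea with invented geometry; it is not the NASA algorithm.

      import numpy as np

      # Illustrative quick distance bound (not the method of the NASA report):
      # precompute a bounding sphere for the complex object, then use
      # |p - center| - radius as a cheap lower bound on the true distance.
      verts = np.random.rand(5000, 3) * 2.0 - 1.0   # hypothetical object vertices
      center = verts.mean(axis=0)
      radius = np.max(np.linalg.norm(verts - center, axis=1))

      def quick_distance(p, threshold=0.5):
          lower = np.linalg.norm(p - center) - radius   # cheap lower bound
          if lower > threshold:
              return lower              # certainly far away: skip the exact test
          # otherwise fall back to an expensive scan (here: nearest vertex)
          return np.min(np.linalg.norm(verts - p, axis=1))

      print(quick_distance(np.array([5.0, 0.0, 0.0])))
      print(quick_distance(np.array([1.1, 0.0, 0.0])))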

  4. Unified Geophysical Cloud Platform (UGCP) for Seismic Monitoring and other Geophysical Applications.

    NASA Astrophysics Data System (ADS)

    Synytsky, R.; Starovoit, Y. O.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.

    2016-12-01

    We present the Unified Geophysical Cloud Platform (UGCP), or UniGeoCloud, an innovative approach to geophysical data processing in the cloud, with the ability to run any type of data processing software in an isolated environment within a single cloud platform. We have developed a simple and quick installation method for several widely known open-source seismic software packages (SeisComp3, Earthworm, Geotool, MSNoise) that requires no knowledge of system administration, configuration, or OS compatibility issues, and avoids the other often annoying details that consume time in system configuration work. Installation is reduced to a mouse click on the selected software package in the cloud marketplace. The main objective of the developed capability is to let users quickly design and install their own highly reliable and highly available virtual IT infrastructure for seismic (and, in the future, other geophysical) data processing, for either research or monitoring purposes. These tools provide access to any seismic station data openly available over IP from networks affiliated with different institutions and organizations. Users can also set up their own networks by selecting either regionally deployed stations or a worldwide global network of stations from the global map. The processing software, products, and research results can be easily monitored from anywhere using a variety of devices, from desktop computers to mobile gadgets. Current efforts of the development team are directed at achieving Scalability, Reliability, and Sustainability (SRS), allowing any user to run their applications with confidence that no data will be lost and no monitoring or research software component will fail. The system is also suitable for quick rollout of the NDC-in-Box software package developed for State Signatories and aimed at promoting the processing of data collected by the IMS Network.

  5. Toolkit of Available EPA Green Infrastructure Modeling ...

    EPA Pesticide Factsheets

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).

  6. Steady-State Computation of Constant Rotational Rate Dynamic Stability Derivatives

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Green, Lawrence L.

    2000-01-01

    Dynamic stability derivatives are essential to predicting the open and closed loop performance, stability, and controllability of aircraft. Computational determination of constant-rate dynamic stability derivatives (derivatives of aircraft forces and moments with respect to constant rotational rates) is currently performed indirectly with finite differencing of multiple time-accurate computational fluid dynamics solutions. Typical time-accurate solutions require excessive amounts of computational time to complete. Formulating Navier-Stokes (N-S) equations in a rotating noninertial reference frame and applying an automatic differentiation tool to the modified code has the potential for directly computing these derivatives with a single, much faster steady-state calculation. The ability to rapidly determine static and dynamic stability derivatives by computational methods can benefit multidisciplinary design methodologies and reduce dependency on wind tunnel measurements. The CFL3D thin-layer N-S computational fluid dynamics code was modified for this study to allow calculations on complex three-dimensional configurations with constant rotation rate components in all three axes. These CFL3D modifications also have direct application to rotorcraft and turbomachinery analyses. The modified CFL3D steady-state calculation is a new capability that showed excellent agreement with results calculated by a similar formulation. The application of automatic differentiation to CFL3D allows the static stability and body-axis rate derivatives to be calculated quickly and exactly.
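
    A minimal illustration of why direct differentiation beats finite differencing of separate solutions is the complex-step derivative, a technique widely used to verify such sensitivity calculations (named here as a stand-in; the paper itself applies automatic differentiation to CFL3D). The toy moment model below is invented.

      import numpy as np

      # Toy pitching-moment model Cm(q) of a rotation rate q (purely
      # illustrative; a real CFD solver replaces this function).
      def Cm(q):
          return -0.8 * np.sin(2.0 * q) + 0.05 * q**2

      q0 = 0.1
      h = 1e-200                                 # complex-step size
      dCm_cs = np.imag(Cm(q0 + 1j * h)) / h      # complex step: no subtraction error
      dCm_fd = (Cm(q0 + 1e-6) - Cm(q0 - 1e-6)) / 2e-6   # central finite difference
      print(dCm_cs, dCm_fd)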

  7. Current and anticipated uses of the thermal hydraulics codes at the NRC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caruso, R.

    1997-07-01

    The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs, by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software, by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.

  8. The Father Christmas worm

    NASA Technical Reports Server (NTRS)

    Green, James L.; Sisson, Patricia L.

    1989-01-01

    Given here is an overview analysis of the Father Christmas Worm, a computer worm that was released onto the DECnet Internet three days before Christmas 1988. The purpose behind the worm was to send an electronic mail message to all users on the computer system running the worm. The message was a Christmas greeting and was signed 'Father Christmas'. From the investigation, it was determined that the worm was released from a computer (node number 20597::) at a university in Switzerland. The worm was designed to travel quickly. Estimates are that it was copied to over 6,000 computer nodes. However, it was believed to have executed on only a fraction of those computers. Within ten minutes after it was released, the worm was detected at the Space Physics Analysis Network (SPAN), NASA's largest space and Earth science network. Once the source program was captured, a procedural cure, using the existing functionality of the computer operating systems, was quickly devised and distributed. A combination of existing computer security measures, the quick and accurate procedures devised to stop copies of the worm from executing, and the network itself, were used to rapidly provide the cure. These were the main reasons why the worm executed on such a small percentage of nodes. This overview of the analysis of the events concerning the worm is based on an investigation made by the SPAN Security Team and provides some insight into future security measures that will be taken to handle computer worms and viruses that may hit similar networks.

  9. Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)

    NASA Technical Reports Server (NTRS)

    Clark, Robert; Cotter, Paul; Michalopoulos, Constantine

    2013-01-01

    This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (program "IMPACT2"). The program was developed after the Columbia accident in order to quickly assess WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from a sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program, with predicted projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution. A typical model of foam, or another projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of the hours needed by the commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known.
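
    A single-mode version of the coupled model is easy to sketch: one target vibration mode and a projectile mass interact through a contact spring that acts only in compression. All parameter values below are hypothetical, and IMPACT2 itself couples many wing modes with nonlinear projectile models; this only illustrates the structure of the formulation.

      import numpy as np
      from scipy.integrate import solve_ivp

      # One target vibration mode (M, K) hit by a projectile (m) through a
      # contact spring kc that acts only while the bodies overlap.
      M, K = 5.0, 2.0e6        # modal mass (kg) and stiffness (N/m), hypothetical
      m, kc = 0.5, 1.0e5       # projectile mass and contact stiffness
      v0 = 100.0               # impact speed (m/s)

      def rhs(t, y):
          xp, vp, xt, vt = y                      # projectile and target states
          pen = xp - xt                           # penetration depth
          F = kc * pen if pen > 0.0 else 0.0      # contact force (compression only)
          return [vp, -F / m, vt, (F - K * xt) / M]

      sol = solve_ivp(rhs, (0.0, 0.01), [-1e-4, v0, 0.0, 0.0],
                      max_step=1e-6, rtol=1e-8)
      pen = np.maximum(sol.y[0] - sol.y[2], 0.0)
      print("peak contact force (N):", kc * pen.max())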

  10. Rapid antigen detection test for respiratory syncytial virus diagnosis as a diagnostic tool.

    PubMed

    Mesquita, Flávio da Silva; Oliveira, Danielle Bruna Leal de; Crema, Daniela; Pinez, Célia Miranda Nunes; Colmanetti, Thaís Cristina; Thomazelli, Luciano Matsumia; Gilio, Alfredo Elias; Vieira, Sandra Elisabeth; Martinez, Marina Baquerizo; Botosso, Viviane Fongaro; Durigon, Edison Luiz

    The aim of this study was to evaluate the QuickVue® RSV Test Kit (QUIDEL Corp, CA, USA) as a screening tool for respiratory syncytial virus in children with acute respiratory disease, in comparison with the indirect immunofluorescence assay as the gold standard. In Brazil, rapid antigen detection tests are not routinely utilized as diagnostic tools, except for the diagnosis of dengue and influenza. The authors retrospectively analyzed 486 nasopharyngeal aspirate samples from children under age 5 with acute respiratory infection, collected between December 2013 and August 2014; the samples were analyzed by indirect immunofluorescence assay and the QuickVue® RSV Test Kit. Samples with discordant results were analyzed by real-time PCR and nucleotide sequencing. Of 313 samples positive by immunofluorescence assay, 282 (90%) were also positive by the rapid antigen detection test; two samples were positive only by the rapid antigen detection test, 33 were positive only by immunofluorescence assay, and 171 were positive by both methods. The 35 samples with discordant results were analyzed by real-time PCR; the two samples positive only by the rapid antigen detection test and the five positive only by immunofluorescence assay were also positive by real-time PCR. There was no relation between negativity by the QuickVue® RSV Test and viral load or specific strain. The QuickVue® RSV Test showed sensitivity of 90%, specificity of 98.8%, positive predictive value of 99.3%, and negative predictive value of 94.6%, with accuracy of 93.2% and an agreement κ index of 0.85 in comparison to the immunofluorescence assay. This study demonstrated that the QuickVue® RSV Test Kit can be effective in early detection of respiratory syncytial virus in nasopharyngeal aspirates and is reliable for use as a diagnostic tool in pediatrics. Copyright © 2016 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
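
    The reported figures follow from the standard 2x2 confusion-matrix formulas. The sketch below computes sensitivity, specificity, predictive values, accuracy, and Cohen's kappa from generic counts; the numbers are illustrative, not the study's data.

      # Diagnostic-test metrics from a 2x2 table (counts are illustrative,
      # not the study's data). Rows: rapid test +/-, columns: gold standard +/-.
      tp, fp = 90, 5       # rapid-test positives: true and false
      fn, tn = 10, 95      # rapid-test negatives: false and true
      n = tp + fp + fn + tn

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)                 # positive predictive value
      npv = tn / (tn + fn)                 # negative predictive value
      accuracy = (tp + tn) / n
      # Cohen's kappa: observed agreement corrected for chance agreement
      p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
      kappa = (accuracy - p_chance) / (1 - p_chance)
      print(sensitivity, specificity, ppv, npv, accuracy, kappa)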

  11. Nonsequential Computation and Laws of Nature.

    DTIC Science & Technology

    1986-05-01

    computing engines arose as a byproduct of the Manhattan Project in World War II. Broadly speaking, their purpose was to compute numerical solutions to...nature, and to representing algorithms in structures of space and time. After the Manhattan Project had been fulfilled, computer designers quickly pro

  12. PIPI: PTM-Invariant Peptide Identification Using Coding Method.

    PubMed

    Yu, Fengchao; Li, Ning; Yu, Weichuan

    2016-12-02

    In computational proteomics, the identification of peptides with an unlimited number of post-translational modification (PTM) types is a challenging task. The computational cost associated with database search increases exponentially with respect to the number of modified amino acids and linearly with respect to the number of potential PTM types at each amino acid. The problem becomes intractable very quickly if we want to enumerate all possible PTM patterns. To address this issue, one group of methods, named restricted tools (including Mascot, Comet, and MS-GF+), only allows a small number of PTM types in the database search process. Alternatively, the other group of methods, named unrestricted tools (including MS-Alignment, ProteinProspector, and MODa), avoids enumerating PTM patterns with an alignment-based approach to localizing and characterizing modified amino acids. However, because of the large search space and the PTM localization issue, the sensitivity of these unrestricted tools is low. This paper proposes a novel method named PIPI to achieve PTM-invariant peptide identification. PIPI belongs to the category of unrestricted tools. It first codes peptide sequences into Boolean vectors and codes experimental spectra into real-valued vectors. For each coded spectrum, it then searches the coded sequence database to find the top-scored peptide sequences as candidates. After that, PIPI uses dynamic programming to localize and characterize modified amino acids in each candidate. We used simulation experiments and real data experiments to evaluate the performance in comparison with restricted tools (i.e., Mascot, Comet, and MS-GF+) and unrestricted tools (i.e., Mascot with error tolerant search, MS-Alignment, ProteinProspector, and MODa). Comparison with restricted tools shows that PIPI achieves comparable sensitivity and running speed. Comparison with unrestricted tools shows that PIPI has the highest sensitivity except for Mascot with error tolerant search and ProteinProspector; these two tools simplify the task by considering at most one modified amino acid per peptide, which yields higher sensitivity but has difficulty dealing with multiple modified amino acids. The simulation experiments also show that PIPI has the lowest false discovery proportion, the highest PTM characterization accuracy, and the shortest running time among the unrestricted tools.
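
    The coding step can be sketched in a few lines: each candidate peptide becomes a Boolean vector of binned fragment masses, each spectrum a real-valued binned intensity vector, and a dot product scores the match. This greatly simplified sketch ignores PTMs, charge states, and most ion chemistry, and the peptides and peaks below are invented; it only illustrates the vector-coding idea.

      import numpy as np

      # Monoisotopic residue masses for a subset of amino acids (Da).
      AA = {'G': 57.02146, 'A': 71.03711, 'S': 87.03203, 'P': 97.05276,
            'V': 99.06841, 'T': 101.04768, 'L': 113.08406, 'N': 114.04293,
            'D': 115.02694, 'K': 128.09496, 'E': 129.04259, 'R': 156.10111}
      H2O, PROTON = 18.01056, 1.00728
      BIN, MAX_MZ = 1.0005, 2000.0          # common m/z binning width
      NBINS = int(MAX_MZ / BIN)

      def code_peptide(seq):
          """Boolean vector marking binned b/y fragment masses (simplified)."""
          vec = np.zeros(NBINS)
          prefix = np.cumsum([AA[a] for a in seq])
          total = prefix[-1]
          for p in prefix[:-1]:
              for frag in (p + PROTON,                   # b ion
                           total - p + H2O + PROTON):    # complementary y ion
                  vec[int(frag / BIN)] = 1.0
          return vec

      def code_spectrum(peaks):                          # peaks: [(mz, intensity)]
          vec = np.zeros(NBINS)
          for mz, inten in peaks:
              vec[int(mz / BIN)] += inten
          return vec

      spectrum = code_spectrum([(234.1, 80.0), (561.3, 50.0), (716.4, 95.0)])
      for candidate in ("PEPTLDE", "SLGKADE"):           # invented candidates
          print(candidate, np.dot(code_peptide(candidate), spectrum))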

  13. Applications of Phase-Based Motion Processing

    NASA Technical Reports Server (NTRS)

    Branch, Nicholas A.; Stewart, Eric C.

    2018-01-01

    Image pyramids provide useful information for determining structural response at low cost using commercially available cameras. The current effort applies previous work on the complex steerable pyramid to analyze and identify imperceptible linear motions in video. Instead of implicitly computing motion spectra through phase analysis of the complex steerable pyramid and magnifying the associated motions, we present a visual technique and the necessary software to display the phase changes of high-frequency signals within video. The present technique quickly identifies the regions of largest motion within a video with a single phase visualization and without the artifacts of motion magnification, but requires the computationally intensive Fourier transform. While Riesz pyramids present an alternative to the computationally intensive complex steerable pyramid for motion magnification, the Riesz formulation contains significant noise, and motion magnification still produces large amounts of data that cannot be quickly assessed by the human eye. Thus, user-friendly software is presented for quickly identifying structural response through optical flow and phase visualization, in both Python and MATLAB.
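
    The phase principle can be shown with a one-dimensional toy: a narrowband pattern shifted by a sub-pixel amount changes its local phase by k·dx, so the phase difference between frames reveals the motion. The sketch below uses the analytic signal from scipy.signal.hilbert rather than a complex steerable pyramid, and the signal is synthetic.

      import numpy as np
      from scipy.signal import hilbert

      # 1D toy of phase-based motion estimation: a narrowband pattern shifted
      # by a tiny sub-pixel amount produces a local phase change of k * dx.
      x = np.arange(2048)
      k = 2 * np.pi / 32.0                  # spatial frequency of the pattern
      dx = 0.05                             # true displacement in pixels
      frame1 = np.cos(k * x)
      frame2 = np.cos(k * (x - dx))         # shifted copy

      phi1 = np.angle(hilbert(frame1))      # local phase via analytic signal
      phi2 = np.angle(hilbert(frame2))
      dphi = np.angle(np.exp(1j * (phi2 - phi1)))   # wrap to (-pi, pi]
      est = -np.median(dphi) / k            # phase shift -> displacement
      print("estimated displacement:", est, "true:", dx)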

  14. GenomePeek—an online tool for prokaryotic genome and metagenome analysis

    DOE PAGES

    McNair, Katelyn; Edwards, Robert A.

    2015-06-16

    As prokaryotic sequencing increases, a method to quickly and accurately analyze the resulting data is needed. Previous tools are mainly designed for metagenomic analysis and have limitations, such as long runtimes and significant false positive error rates. The online tool GenomePeek (edwards.sdsu.edu/GenomePeek) was developed to analyze both single genome and metagenome sequencing files, quickly and with low error rates. GenomePeek uses a sequence assembly approach in which reads matching a set of conserved genes are extracted, assembled, and then aligned against a highly specific reference database. GenomePeek was found to be faster than traditional approaches while still keeping error rates low, as well as offering unique data visualization options.

  15. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349

  16. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.
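
    A toy calculation shows the combinatorial pressure that motivates rule-based specification: a protein with n independent binary modification sites gives rise to 2^n distinct species, yet one rule per site suffices to describe the chemistry. The sketch below is generic and not tied to any particular rule-based tool.

      from itertools import product

      # Why explicit model specification explodes: a protein with n independent
      # binary sites (e.g., phosphorylation on/off) has 2**n distinct species,
      # while a rule-based description needs only one rule per site.
      n = 10
      sites = [f"s{i}" for i in range(n)]
      species = [''.join('P' if on else 'u' for on in state)
                 for state in product([False, True], repeat=n)]
      print(f"{n} sites -> {len(species)} species (explicit specification)")

      # One rule per site, applicable regardless of the other sites' states:
      rules = [f"{s}: u -> P  (rate kp)" for s in sites]
      print(f"{n} sites -> {len(rules)} rules (rule-based specification)")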

  17. Green Infrastructure Modeling Toolkit

    EPA Pesticide Factsheets

    EPA's Green Infrastructure Modeling Toolkit is a toolkit of 5 EPA green infrastructure models and tools, along with communication materials, that can be used as a teaching tool and a quick reference resource when making GI implementation decisions.

  18. Virtual Character Animation Based on Affordable Motion Capture and Reconfigurable Tangible Interfaces.

    PubMed

    Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavo, Alberto; Montuschi, Paolo

    2018-05-01

    Software for computer animation is generally characterized by a steep learning curve, due to the entanglement of both sophisticated techniques and interaction methods required to control 3D geometries. This paper proposes a tool designed to support computer animation production processes by leveraging the affordances offered by articulated tangible user interfaces and motion capture retargeting solutions. To this aim, orientations of an instrumented prop are recorded together with animator's motion in the 3D space and used to quickly pose characters in the virtual environment. High-level functionalities of the animation software are made accessible via a speech interface, thus letting the user control the animation pipeline via voice commands while focusing on his or her hands and body motion. The proposed solution exploits both off-the-shelf hardware components (like the Lego Mindstorms EV3 bricks and the Microsoft Kinect, used for building the tangible device and tracking animator's skeleton) and free open-source software (like the Blender animation tool), thus representing an interesting solution also for beginners approaching the world of digital animation for the first time. Experimental results in different usage scenarios show the benefits offered by the designed interaction strategy with respect to a mouse & keyboard-based interface both for expert and non-expert users.

  19. Input-output model for MACCS nuclear accident impacts estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
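
    The Input-Output calculation at the heart of such GDP-loss estimates is the Leontief model: total output x satisfies x = Ax + d, so x = (I - A)^-1 d, and a disruption to final demand d propagates through inter-industry linkages. The three-sector economy below is invented, and the sketch illustrates the methodology, not REAcct or MACCS themselves.

      import numpy as np

      # Leontief input-output sketch. A is the technical-coefficients matrix;
      # total output x satisfies x = A x + d, i.e., x = (I - A)^-1 d.
      A = np.array([[0.10, 0.20, 0.05],     # hypothetical 3-sector economy
                    [0.15, 0.05, 0.10],
                    [0.05, 0.10, 0.15]])
      d = np.array([100.0, 80.0, 120.0])    # baseline final demand ($M)

      L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse
      x_base = L @ d

      # Accident scenario: sector 0's final demand drops 30% during the outage.
      d_event = d * np.array([0.7, 1.0, 1.0])
      x_event = L @ d_event
      print("output loss by sector ($M):", x_base - x_event)
      print("total output loss ($M):", (x_base - x_event).sum())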

  1. Metavisitor, a Suite of Galaxy Tools for Simple and Rapid Detection and Discovery of Viruses in Deep Sequence Data

    PubMed Central

    Vernick, Kenneth D.

    2017-01-01

    Metavisitor is a software package that allows biologists and clinicians without specialized bioinformatics expertise to detect and assemble viral genomes from deep sequence datasets. The package is composed of a set of modular bioinformatic tools and workflows that are implemented in the Galaxy framework. Using the graphical Galaxy workflow editor, users with minimal computational skills can use existing Metavisitor workflows or adapt them to suit specific needs by adding or modifying analysis modules. Metavisitor works with DNA, RNA or small RNA sequencing data over a range of read lengths and can use a combination of de novo and guided approaches to assemble genomes from sequencing reads. We show that the software has the potential for quick diagnosis as well as discovery of viruses from a vast array of organisms. Importantly, we provide here executable Metavisitor use cases, which increase the accessibility and transparency of the software, ultimately enabling biologists or clinicians to focus on biological or medical questions. PMID:28045932

  2. Phyx: phylogenetic tools for unix.

    PubMed

    Brown, Joseph W; Walker, Joseph F; Smith, Stephen A

    2017-06-15

    The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  3. Phyx: phylogenetic tools for unix

    PubMed Central

    Brown, Joseph W.; Walker, Joseph F.; Smith, Stephen A.

    2017-01-01

    Abstract Summary: The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. Availability and Implementation: phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28174903

  4. The t-test: An Influential Inferential Tool in Chaplaincy and Other Healthcare Research.

    PubMed

    Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T

    2018-01-01

    The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with three simple tools: paper, pen, and a calculator. A computer program can quickly calculate the t-test for large samples. The ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
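
    The pooled-variance two-sample t statistic is simple enough to compute by hand and check against a library routine. The data below are made up for illustration.

      import numpy as np
      from scipy import stats

      # Student's two-sample t-test "by hand" (pooled-variance, equal-variance
      # form), checked against scipy. Data are invented for illustration.
      a = np.array([4.1, 5.0, 4.8, 5.6, 4.3, 5.2])
      b = np.array([5.9, 6.4, 5.1, 6.8, 6.0])

      n1, n2 = len(a), len(b)
      sp2 = ((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2)
      t = (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))
      df = n1 + n2 - 2
      p = 2 * stats.t.sf(abs(t), df)                 # two-sided p-value
      print(t, p)
      print(stats.ttest_ind(a, b))                   # should match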

  5. Electronic Handbooks Simplify Process Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  6. Chromosome surveys of human populations: between epidemiology and anthropology.

    PubMed

    de Chadarevian, Soraya

    2014-09-01

    It is commonly held that after 1945 human genetics turned medical and focussed on the individual rather than on the study of human populations, which had become discredited. However, a closer look at the research practices of the time quickly reveals that human population studies, using old and new tools, prospered in this period. The essay focuses on the rise of chromosome analysis as a new tool for the study of human populations. It reviews a broad array of population studies ranging from newborn screening programmes to studies of isolated or 'primitive' people. Throughout, it highlights the continuing role of concerns and opportunities raised by the propagation of atomic energy for civilian and military uses, the collection of large databases and the use of computers, and the role of international organisations like the World Health Organisation and the International Biological Programme in shaping research agendas and carving out a space for human heredity in the postwar era. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Engineering visualization utilizing advanced animation

    NASA Technical Reports Server (NTRS)

    Sabionski, Gunter R.; Robinson, Thomas L., Jr.

    1989-01-01

    Engineering visualization is the use of computer graphics to depict engineering analysis and simulation in visual form from project planning through documentation. Graphics displays let engineers see data represented dynamically which permits the quick evaluation of results. The current state of graphics hardware and software generally allows the creation of two types of 3D graphics. The use of animated video as an engineering visualization tool is presented. The engineering, animation, and videography aspects of animated video production are each discussed. Specific issues include the integration of staffing expertise, hardware, software, and the various production processes. A detailed explanation of the animation process reveals the capabilities of this unique engineering visualization method. Automation of animation and video production processes are covered and future directions are proposed.

  8. Methods for planning a statistical POD study

    NASA Astrophysics Data System (ADS)

    Koh, Y.-M.; Meeker, W. Q.

    2013-01-01

    The most common question asked of a statistician is "How large should my sample be?" In NDE applications, the most common questions asked of a statistician are "How many specimens do I need and what should be the distribution of flaw sizes?" Although some useful general guidelines exist (e.g., in MIL-HDBK-1823), it is possible to use statistical tools to provide more definitive guidelines and to allow comparison among different proposed study plans. One can assess the performance of a proposed POD study plan by obtaining computable expressions for estimation precision. This allows for a quick and easy assessment of tradeoffs and comparison of various alternative plans. We use a signal-response dataset obtained from MIL-HDBK-1823 to illustrate the ideas.

  9. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
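
    The core idea, loading particle blocks from a GADGET-style HDF5 snapshot into a pandas DataFrame, can be sketched directly. The PartType0/Coordinates layout follows common GADGET HDF5 conventions but is assumed here, and a tiny fake snapshot is written first so the example is self-contained; gadfly itself adds unit handling, coordinate transformations, and batch processing on top.

      import h5py
      import numpy as np
      import pandas as pd

      # Write a tiny fake GADGET-style snapshot so the example runs standalone.
      with h5py.File("snapshot_000.hdf5", "w") as f:
          g = f.create_group("PartType0")              # gas particles
          g["Coordinates"] = np.random.rand(1000, 3)
          g["Density"] = np.random.rand(1000)

      # Load a particle block into a pandas DataFrame (the gadfly-style idea).
      with h5py.File("snapshot_000.hdf5", "r") as f:
          coords = f["PartType0/Coordinates"][...]
          df = pd.DataFrame(coords, columns=["x", "y", "z"])
          df["density"] = f["PartType0/Density"][...]

      # pandas then gives cheap selection/aggregation over the particle data:
      print(df[df.density > 0.9].describe())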

  10. Science & Technology Review September/October 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bearinger, J P

    2008-07-21

    This issue has the following articles: (1) Answering Scientists' Most Audacious Questions--Commentary by Dona Crawford; (2) Testing the Accuracy of the Supernova Yardstick--High-resolution simulations are advancing understanding of Type Ia supernovae to help uncover the mysteries of dark energy; (3) Developing New Drugs and Personalized Medical Treatment--Accelerator mass spectrometry is emerging as an essential tool for assessing the effects of drugs in humans; (4) Triage in a Patch--A painless skin patch and accompanying detector can quickly indicate human exposure to biological pathogens, chemicals, explosives, or radiation; and (5) Smoothing Out Defects for Extreme Ultraviolet Lithography--A process for smoothing mask defects helps move extreme ultraviolet lithography one step closer to creating smaller, more powerful computer chips.

  11. Large-scale linear rankSVM.

    PubMed

    Lee, Ching-Pei; Lin, Chih-Jen

    2014-04-01

    Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful for quickly producing a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of work has studied linear rankSVM, with a focus on computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.
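
    The rankSVM formulation reduces to an ordinary linear SVM via the pairwise transform: for each preference pair (i preferred over j), train on the difference vector x_i - x_j with label +1 (and its negation with -1). The sketch below uses scikit-learn on synthetic data and illustrates the formulation, not the letter's efficient solver.

      import numpy as np
      from sklearn.svm import LinearSVC

      # Minimal linear rankSVM via the pairwise transform on synthetic data.
      rng = np.random.default_rng(0)
      w_true = np.array([1.0, -2.0, 0.5])
      X = rng.normal(size=(200, 3))
      scores = X @ w_true + 0.1 * rng.normal(size=200)   # noisy relevance

      pairs, labels = [], []
      for i in range(0, 200, 2):                         # sample preference pairs
          j = i + 1
          diff = X[i] - X[j]
          sign = 1.0 if scores[i] > scores[j] else -1.0
          pairs.extend([diff * sign, -diff * sign])      # symmetric +1/-1 pairs
          labels.extend([1, -1])

      clf = LinearSVC(C=1.0, max_iter=10000).fit(np.array(pairs), np.array(labels))
      w = clf.coef_.ravel()
      print("learned direction:", w / np.linalg.norm(w))
      print("true direction:   ", w_true / np.linalg.norm(w_true))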

  12. NMESys: An expert system for network fault detection

    NASA Technical Reports Server (NTRS)

    Nelson, Peter C.; Warpinski, Janet

    1991-01-01

    The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. The complexity of implementing a network with this many components is difficult enough, while the maintenance of such a network is an even larger problem. A prototype network management expert system, NMESys, was implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard failures and potential failures, and to minimize or eliminate user downtime in a large network.

  13. How to Run FAST Simulations.

    PubMed

    Zimmerman, M I; Bowman, G R

    2016-01-01

    Molecular dynamics (MD) simulations are a powerful tool for understanding enzymes' structures and functions with full atomistic detail. These physics-based simulations model the dynamics of a protein in solution and store snapshots of its atomic coordinates at discrete time intervals. Analysis of the snapshots from these trajectories provides thermodynamic and kinetic properties such as conformational free energies, binding free energies, and transition times. Unfortunately, simulating biologically relevant timescales with brute force MD simulations requires enormous computing resources. In this chapter we detail a goal-oriented sampling algorithm, called fluctuation amplification of specific traits, that quickly generates pertinent thermodynamic and kinetic information by using an iterative series of short MD simulations to explore the vast depths of conformational space. © 2016 Elsevier Inc. All rights reserved.
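
    The structure of a FAST-style loop can be shown with a toy in which random walks on a one-dimensional coordinate stand in for short MD simulations: rank discovered states by a directed reward (the structural trait) plus an exploration bonus, then restart simulations from the best-ranked states. Everything below is invented for illustration and is not a production FAST implementation.

      import numpy as np

      # Toy FAST-style adaptive sampling loop. Random walks on a 1D coordinate
      # stand in for short MD trajectories; states are ranked by a directed
      # reward (progress along the trait) plus an exploration bonus that
      # favors rarely visited states.
      rng = np.random.default_rng(1)
      counts = {}                         # visit counts per discretized state

      def short_sim(start, n_steps=50):   # stand-in for a short MD trajectory
          x = start
          for _ in range(n_steps):
              x += rng.normal(scale=0.1)
              s = round(x, 1)
              counts[s] = counts.get(s, 0) + 1

      short_sim(0.0)
      for sweep in range(10):
          # rank discovered states: trait value + exploration bonus
          reward = {s: s + 1.0 / np.sqrt(counts[s]) for s in counts}
          seeds = sorted(reward, key=reward.get, reverse=True)[:5]
          for s in seeds:                 # restart short simulations from the best
              short_sim(s)
          print(f"sweep {sweep}: max coordinate reached = {max(counts):.1f}")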

  14. Remote sensing entropy to assess the sustainability of rainfall in tropical catchment

    NASA Astrophysics Data System (ADS)

    Mahmud, M. R.; Reba, M. N. M.; Wei, J. S.; Razak, N. H. Abdul

    2018-02-01

    This study demonstrated the utility of entropy computation using satellite precipitation remote sensing data to assess the sustainability of rainfall in tropical catchments. Two major issues must be anticipated in monitoring tropical catchments: first, frequent monitoring of the rainfall, and second, an appropriate indicator that is sensitive to changes or disorder in rainfall patterns. For the first issue, the use of satellite remote sensing precipitation data is suggested. For the second, the entropy concept, which interprets the disorder of temporal rainfall, has been successfully adopted in several studies to assess sustainability. We therefore hypothesized that computing entropy from satellite precipitation data can be a novel tool for anticipating the above-mentioned conflict early. The remote sensing entropy results and in-situ river levels showed good agreement, indicating the method's reliability. 72% of the catchments had moderate to good rainfall supply during normal (non-drought) conditions. However, our results showed that the catchments were highly sensitive to drought, especially on the west coast and in the southern part of Peninsular Malaysia; high resiliency was identified on the east coast. We conclude that the proposed entropy-quantity scheme is a useful tool for cost-effective, quick, and operational sustainability assessment.
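
    The entropy computation itself is a one-liner: treat each month's share of annual rainfall as a probability and take the Shannon entropy; lower values indicate a more uneven, disorder-prone distribution. The monthly values below are made up, and the study applies this idea to satellite precipitation estimates rather than to a single invented series.

      import numpy as np

      # Shannon entropy of a rainfall record as a disorder indicator
      # (monthly values are invented for illustration).
      rain = np.array([210., 180., 150., 90., 60., 30.,
                       20., 25., 70., 140., 190., 220.])   # mm/month
      p = rain / rain.sum()                 # monthly share of annual rainfall
      H = -np.sum(p * np.log(p))            # Shannon entropy (nats)
      H_max = np.log(len(rain))             # perfectly even rainfall
      print("entropy:", H, "fraction of max:", H / H_max)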

  15. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  16. Virtual Safety Training.

    ERIC Educational Resources Information Center

    Fuller, Scott; Davis, Jason

    2003-01-01

    The Multimedia Tool Box Talk is a web-based quick reference safety guide and training tool for construction personnel. An intended outcome of this effort was to provide an efficient and effective way to locate and interpret crucial safety information while at the job site. The tool includes information from the Occupational Safety and Health…

  17. Multipurpose Scribing and Drawing Tool

    NASA Technical Reports Server (NTRS)

    Ellis, J. M.

    1986-01-01

    Two-part tool reconfigured for variety of jobs. Tool performs several functions useful in layout. Lines, curves, and angles made visible as either bright scribe marks or as dark pencil (or ink) marks. Multipurpose tool speeds up laying out of patterns on sheet metal, wood, plastic, or paper. Tool is carried in pocket, then quickly assembled for service as height gauge, pair of dividers, protractor, surface gauge, or square.

  18. Biomolecular modeling and simulation: a field coming of age

    PubMed Central

    Schlick, Tamar; Collepardo-Guevara, Rosana; Halvorsen, Leif Arthur; Jung, Segun; Xiao, Xia

    2013-01-01

    We assess the progress in biomolecular modeling and simulation, focusing on structure prediction and dynamics, by presenting the field’s history, metrics for its rise in popularity, early expressed expectations, and current significant applications. The increases in computational power combined with improvements in algorithms and force fields have led to considerable success, especially in protein folding, specificity of ligand/biomolecule interactions, and interpretation of complex experimental phenomena (e.g., NMR relaxation, protein-folding kinetics and multiple conformational states) through the generation of structural hypotheses and pathway mechanisms. Although far from a general automated tool, structure prediction is notable for proteins and RNA, especially by knowledge-based approaches, with predictions that have preceded experiment. Thus, despite early unrealistic expectations and the realization that computer technology alone will not quickly bridge the gap between experimental and theoretical time frames, ongoing improvements to enhance the accuracy and scope of modeling and simulation are propelling the field onto a productive trajectory to become a full partner with experiment and a field in its own right. PMID:21226976

  19. Satellite Imagery Analysis for Automated Global Food Security Forecasting

    NASA Astrophysics Data System (ADS)

    Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.

    2017-12-01

    The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high-rate and dimensionality of imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping, to yearly and monthly vegetative health change trends at the structural field level.

  20. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  1. New Python-based methods for data processing

    PubMed Central

    Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel

    2013-01-01

    Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units. PMID:23793153
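
    The multiprocessing pattern behind such high-throughput analysis is straightforward: farm images out to a pool of workers. In the sketch below the per-image analysis is a stub standing in for the real Bragg-spot finding done by cctbx.spotfinder.

      from multiprocessing import Pool

      import numpy as np

      # Sketch of the worker-pool pattern for keeping up with high detector
      # data rates. The per-image analysis here is a stub, not cctbx code.
      def count_spots(seed):
          rng = np.random.default_rng(seed)        # stand-in for reading an image
          img = rng.poisson(1.0, size=(512, 512))
          return int((img > 6).sum())              # crude "spot" threshold

      if __name__ == "__main__":
          with Pool(processes=4) as pool:
              results = pool.map(count_spots, range(32))   # 32 "images"
          print("spots per image:", results[:8])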

  2. Boundary methods for mode estimation

    NASA Astrophysics Data System (ADS)

    Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.

    1999-08-01

    This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable in terms of both accuracy and computational cost to other popular mode estimation techniques currently found in the literature and in automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation. It also briefly reviews other common mode estimation techniques and describes the empirical investigation used to explore the relationship of the BM technique to them. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture-of-Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion for the MOG and k-means techniques is the Akaike Information Criterion (AIC).
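
    The MOG-plus-AIC baseline mentioned above is easy to reproduce: fit mixtures with k = 1..K components and select the k minimizing the AIC. The sketch below uses scikit-learn on synthetic three-mode data; it illustrates the comparison technique, not the Boundary Method itself.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # MOG baseline for mode estimation: fit mixtures with k = 1..K
      # components and pick the k that minimizes AIC.
      rng = np.random.default_rng(2)
      data = np.concatenate([rng.normal(-3, 1, 300),
                             rng.normal(0, 1, 300),
                             rng.normal(4, 1, 300)]).reshape(-1, 1)

      aics = []
      for k in range(1, 7):
          gm = GaussianMixture(n_components=k, random_state=0).fit(data)
          aics.append(gm.aic(data))
      print("estimated number of modes:", int(np.argmin(aics)) + 1)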

  3. Computer-Assisted Decision Support for Student Admissions Based on Their Predicted Academic Performance.

    PubMed

    Muratov, Eugene; Lewis, Margaret; Fourches, Denis; Tropsha, Alexander; Cox, Wendy C

    2017-04-01

    Objective. To develop predictive computational models forecasting the academic performance of students in the didactic-rich portion of a doctor of pharmacy (PharmD) curriculum, as admission-assisting tools. Methods. All PharmD candidates over three admission cycles were divided into two groups: those who completed the PharmD program with a GPA ≥ 3, and the remaining candidates. The Random Forest machine-learning technique was used to develop a binary classification model based on 11 pre-admission parameters. Results. Robust and externally predictive models were developed that had a particularly high overall accuracy of 77% for candidates with high or low academic performance. These multivariate models were highly accurate in predicting these groups compared to models obtained using undergraduate GPA and composite PCAT scores only. Conclusion. The models developed in this study can be used to improve the admission process, serving as preliminary filters to quickly identify candidates who are likely to be successful in the PharmD curriculum.
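
    The modeling setup can be sketched with synthetic data: 11 pre-admission predictors, a binary outcome, and a Random Forest evaluated by cross-validation. The feature distributions and the outcome rule below are invented; they mirror only the shape of the problem, not the study's data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # Sketch of the modeling setup on synthetic data: 11 pre-admission
      # predictors, binary outcome (completed with GPA >= 3 or not).
      rng = np.random.default_rng(3)
      X = rng.normal(size=(600, 11))                    # 11 pre-admission parameters
      y = (0.8 * X[:, 0] + 0.5 * X[:, 1] +              # hidden "aptitude" signal
           rng.normal(scale=1.0, size=600)) > 0

      clf = RandomForestClassifier(n_estimators=300, random_state=0)
      print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())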

  4. Leg CT scan

    MedlinePlus

    CAT scan - leg; Computed axial tomography scan - leg; Computed tomography scan - leg; CT scan - leg ... CT scan makes detailed pictures of the body very quickly. The test may help look for: An abscess ...

  5. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    PubMed

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  6. Eleven quick tips for architecting biomedical informatics workflows with cloud computing

    PubMed Central

    Moore, Jason H.

    2018-01-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world’s largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction. PMID:29596416

  7. New Abstraction Networks and a New Visualization Tool in Support of Auditing the SNOMED CT Content

    PubMed Central

    Geller, James; Ochs, Christopher; Perl, Yehoshua; Xu, Junchuan

    2012-01-01

    Medical terminologies are large and complex. Frequently, errors are hidden in this complexity. Our objective is to find such errors, which can be aided by deriving abstraction networks from a large terminology. Abstraction networks preserve important features but eliminate many minor details, which are often not useful for identifying errors. Providing visualizations for such abstraction networks aids auditors by allowing them to quickly focus on elements of interest within a terminology. Previously we introduced area taxonomies and partial area taxonomies for SNOMED CT. In this paper, two advanced, novel kinds of abstraction networks, the relationship-constrained partial area subtaxonomy and the root-constrained partial area subtaxonomy are defined and their benefits are demonstrated. We also describe BLUSNO, an innovative software tool for quickly generating and visualizing these SNOMED CT abstraction networks. BLUSNO is a dynamic, interactive system that provides quick access to well organized information about SNOMED CT. PMID:23304293

  8. New abstraction networks and a new visualization tool in support of auditing the SNOMED CT content.

    PubMed

    Geller, James; Ochs, Christopher; Perl, Yehoshua; Xu, Junchuan

    2012-01-01

    Medical terminologies are large and complex. Frequently, errors are hidden in this complexity. Our objective is to find such errors, which can be aided by deriving abstraction networks from a large terminology. Abstraction networks preserve important features but eliminate many minor details, which are often not useful for identifying errors. Providing visualizations for such abstraction networks aids auditors by allowing them to quickly focus on elements of interest within a terminology. Previously we introduced area taxonomies and partial area taxonomies for SNOMED CT. In this paper, two advanced, novel kinds of abstraction networks, the relationship-constrained partial area subtaxonomy and the root-constrained partial area subtaxonomy are defined and their benefits are demonstrated. We also describe BLUSNO, an innovative software tool for quickly generating and visualizing these SNOMED CT abstraction networks. BLUSNO is a dynamic, interactive system that provides quick access to well organized information about SNOMED CT.
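
    The abstract only sketches how an area taxonomy is derived; the toy Python sketch below illustrates the core partitioning idea (grouping concepts by the set of relationship types they exhibit), with hypothetical concepts and relationship names rather than BLUSNO's actual data model.

        from collections import defaultdict

        # Hypothetical toy data: concept -> set of outgoing relationship types.
        concept_rels = {
            "Appendicitis": frozenset({"finding-site", "associated-morphology"}),
            "Gastritis": frozenset({"finding-site", "associated-morphology"}),
            "Fracture of femur": frozenset({"finding-site"}),
            "Pain": frozenset(),
        }

        # An "area" groups all concepts sharing the same relationship-type set;
        # each area becomes one node of the abstraction network.
        areas = defaultdict(list)
        for concept, rels in concept_rels.items():
            areas[rels].append(concept)

        for rels, concepts in areas.items():
            print(sorted(rels) or ["<no relationships>"], "->", concepts)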

  9. Design Curve Generation for 3D SiC Fiber Architecture

    NASA Technical Reports Server (NTRS)

    Lang, Jerry; Dicarlo, James A.

    2014-01-01

    The design tool provides design curves that offer a simple and quick way to examine multiple factors that can influence the processing and key properties of the preforms and their final SiC-reinforced ceramic composites, without over-committing financial capital to fabricating materials. Tool predictions for process and fiber-fraction properties have been validated for a HNS 3D preform. The virtualization aspect of the tool will be used to quickly generate solid models with actual fiber paths for finite element evaluation, predicting the mechanical and thermal properties of proposed composites as well as mechanical displacement behavior due to creep and stress relaxation, in order to study load-sharing characteristics between constituents for better performance. Tool predictions for the fiber-controlled properties of the SiC/SiC CMCs fabricated from the HNS preforms will be evaluated and upgraded based on measurements of these CMCs.

  10. The Role of Motor Learning in Spatial Adaptation near a Tool

    PubMed Central

    Brown, Liana E.; Doole, Robert; Malfait, Nicole

    2011-01-01

    Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented. PMID:22174944

  11. A history of the INTERNIST-1 and Quick Medical Reference (QMR) computer-assisted diagnosis projects, with lessons learned.

    PubMed

    Miller, R A

    2010-01-01

    The INTERNIST-1/Quick Medical Reference (QMR) diagnostic decision support project spans four decades, from 1971 onward. This paper describes the history of the project and details insights gained of relevance to the general clinical and informatics communities.

  12. Assessment of phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography.

    PubMed

    Ludlow, John B; Walker, Cameron

    2013-12-01

    The increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response to this has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Effective doses resulting from various combinations of field of view size and field location, comparing child and adult anthropomorphic phantoms with the recently introduced i-CAT FLX cone-beam computed tomography unit (Imaging Sciences, Hatfield, Pa), were measured with optically stimulated dosimetry using previously validated protocols. Scan protocols included high resolution (360° rotation, 600 image frames, 120 kV[p], 5 mA, 7.4 seconds), standard (360°, 300 frames, 120 kV[p], 5 mA, 3.7 seconds), QuickScan (180°, 160 frames, 120 kV[p], 5 mA, 2 seconds), and QuickScan+ (180°, 160 frames, 90 kV[p], 3 mA, 2 seconds). Contrast-to-noise ratio was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than standard protocols for the child (P = 0.0167) and adult (P = 0.0055) phantoms. Doses for the 13 × 16-cm cephalometric fields of view ranged from 11 to 85 μSv in the adult phantom and 18 to 120 μSv in the child phantom for the QuickScan+ and standard protocols, respectively. The contrast-to-noise ratio was reduced by approximately two thirds when comparing QuickScan+ with standard exposure parameters. QuickScan+ effective doses are comparable with conventional panoramic examinations. Significant dose reductions are accompanied by significant reductions in image quality. However, this trade-off might be acceptable for certain diagnostic tasks, such as interim assessment of treatment results. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
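
    For readers unfamiliar with the image-quality metric used above, contrast-to-noise ratio has a one-line definition; the Python sketch below uses synthetic regions of interest (all numbers illustrative, and CNR conventions vary between phantom protocols).

        import numpy as np

        def contrast_to_noise(roi_a: np.ndarray, roi_b: np.ndarray) -> float:
            """CNR between two regions of interest, using the common definition
            |mean_A - mean_B| / sigma_B (exact conventions vary by phantom)."""
            return abs(roi_a.mean() - roi_b.mean()) / roi_b.std()

        # Illustrative synthetic ROIs: a reduced-dose protocol raises noise,
        # which is how a QuickScan-style option trades dose for image quality.
        rng = np.random.default_rng(0)
        standard = contrast_to_noise(rng.normal(100, 5, 10_000), rng.normal(60, 5, 10_000))
        quick = contrast_to_noise(rng.normal(100, 15, 10_000), rng.normal(60, 15, 10_000))
        print(f"CNR standard ~ {standard:.1f}, reduced-dose ~ {quick:.1f}")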

  13. Quick and Easy: Use Screen Capture Software to Train and Communicate

    ERIC Educational Resources Information Center

    Schuster, Ellen

    2011-01-01

    Screen capture (screen cast) software can be used to develop short videos for training purposes. Developing videos is quick and easy. This article describes how these videos are used as tools to reinforce face-to-face and interactive TV curriculum training in a nutrition education program. Advantages of developing these videos are shared.…

  14. Quick Reads: Using Ipads to Explore Parabolas

    ERIC Educational Resources Information Center

    Bixler, Sharon G.

    2014-01-01

    An iPad® can be used to teach students to graph parabolas with ease and grasp vocabulary quickly. Parabolas come to life for students in this easily implemented activity described in this article. Teachers can use this tool in a fun and interactive way to not only address these graphing and vocabulary concepts but also introduce and explore…

  15. Web-based interactive 3D visualization as a tool for improved anatomy learning.

    PubMed

    Petersson, Helge; Sinkvist, David; Wang, Chunliang; Smedby, Orjan

    2009-01-01

    Despite a long tradition, conventional anatomy education based on dissection is declining. This study tested a new virtual reality (VR) technique for anatomy learning based on virtual contrast injection. The aim was to assess whether students value this new three-dimensional (3D) visualization method as a learning tool and what value they gain from its use in reaching their anatomical learning objectives. Several 3D vascular VR models were created using an interactive segmentation tool based on the "virtual contrast injection" method. This method allows users, with relative ease, to convert computed tomography or magnetic resonance images into vivid 3D VR movies using the OsiriX software equipped with the CMIV CTA plug-in. Once created using the segmentation tool, the image series were exported in QuickTime Virtual Reality (QTVR) format and integrated within a web framework of the Educational Virtual Anatomy (EVA) program. A total of nine QTVR movies were produced, encompassing most of the major arteries of the body. These movies were supplemented with associated information, color keys, and notes. The results indicate that, in general, students' attitudes towards the EVA program were positive when it was compared with anatomy textbooks, but not when it was compared with dissections. Additionally, knowledge tests suggest a potentially beneficial effect on learning.

  16. AIRNOISE: A Tool for Preliminary Noise-Abatement Terminal Approach Route Design

    NASA Technical Reports Server (NTRS)

    Li, Jinhua; Sridhar, Banavar; Xue, Min; Ng, Hok

    2016-01-01

    Noise from aircraft in the airport vicinity is one of the leading aviation-induced environmental issues. The FAA developed the Integrated Noise Model (INM) and its replacement, the Aviation Environmental Design Tool (AEDT), to assess the noise impact resulting from all aviation activities. However, a software tool is needed that is simple to use for terminal route modification, quick and reasonably accurate for preliminary noise impact evaluation, and flexible enough to be used for iterative design of optimal noise-abatement terminal routes. In this paper, we extend our previous work on developing a noise-abatement terminal approach route design tool, named AIRNOISE, to satisfy these criteria. First, software efficiency has been increased more than tenfold by using the C programming language instead of MATLAB. Moreover, a state-of-the-art high-performance GPU-accelerated computing module was implemented and tested to be hundreds of times faster than the C implementation. Second, a Graphical User Interface (GUI) was developed that allows users to import current terminal approach routes and modify them interactively to design new terminal approach routes. The corresponding noise impacts are then calculated and displayed in the GUI within seconds. Finally, AIRNOISE was applied to a Baltimore-Washington International Airport terminal approach route to demonstrate its usage.
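
    A core operation in any such noise tool is aggregating the contributions of many flight events at a grid point; sound levels in decibels add energetically, not arithmetically. A minimal Python illustration (not AIRNOISE's actual code):

        import math

        def combine_levels(levels_db):
            """Energetically sum sound levels: L = 10*log10(sum 10^(Li/10))."""
            return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

        # Two equally loud 65 dB overflights combine to ~68 dB, not 130 dB.
        print(round(combine_levels([65, 65]), 1))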

  17. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
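
    VIPER itself is built on Snakemake, whose rules declare inputs and outputs and rerun a step only when its outputs are missing or stale. As a rough plain-Python illustration of that freshness logic (not VIPER's actual code; the shell scripts named here are hypothetical placeholders):

        import os
        import subprocess

        def run_step(cmd, inputs, outputs):
            """Run a shell step only if an output is missing or older than an
            input -- the freshness rule a Snakemake rule applies to its
            input:/output: declarations."""
            newest_in = max(os.path.getmtime(p) for p in inputs)
            if all(os.path.exists(o) and os.path.getmtime(o) >= newest_in
                   for o in outputs):
                return  # outputs up to date; skip this step
            subprocess.run(cmd, shell=True, check=True)

        # Hypothetical two-step RNA-seq fragment: align, then count reads.
        run_step("star_align.sh sample.fastq > sample.bam",
                 ["sample.fastq"], ["sample.bam"])
        run_step("count_reads.sh sample.bam > sample.counts",
                 ["sample.bam"], ["sample.counts"])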

  18. Using Internet Based Paraphrasing Tools: Original Work, Patchwriting or Facilitated Plagiarism?

    ERIC Educational Resources Information Center

    Rogerson, Ann M.; McCarthy, Grace

    2017-01-01

    A casual comment by a student alerted the authors to the existence and prevalence of Internet-based paraphrasing tools. A subsequent quick Google search highlighted the broad range and availability of online paraphrasing tools which offer free 'services' to paraphrase large sections of text ranging from sentences, paragraphs, whole articles, book…

  19. Survey Tools for Faculty to Quickly Assess Multidisciplinary Team Dynamics in Capstone Courses

    ERIC Educational Resources Information Center

    Solnosky, Ryan; Fairchild, Joshua

    2017-01-01

    Many engineering faculty have limited skills and/or assessment tools to evaluate team dynamics in multidisciplinary team-based capstone courses. Rapidly deployable tools are needed here to provide proactive feedback to teams to facilitate deeper learning. Two surveys were developed based on industrial and organizational psychology theories around…

  20. An Online Authoring Tool for Creating Activity-Based Learning Objects

    ERIC Educational Resources Information Center

    Ahn, Jeong Yong; Mun, Gil Seong; Han, Kyung Soo; Choi, Sook Hee

    2017-01-01

    As higher education increasingly relies on e-learning, the need for tools that will allow teachers themselves to develop effective e-learning objects as simply and quickly as possible has also been increasingly recognized. This article discusses the design and development of a novel tool, Enook (Evolutionary note book), for creating activity-based…

  1. EBOOK.EXE: A Desktop Authoring Tool for HURAA.

    ERIC Educational Resources Information Center

    Hu, Xiangen; Mathews, Eric; Graesser, Arthur C.; Susarla, Suresh

    The development of authoring tools for intelligent systems is an important step for creating, maintaining, and structuring content in a quick and easy manner. It has the benefit of allowing for a rapid change to new domains or topics for tutoring. The development of such tools requires functional control, access protection, ease of learning, and…

  2. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ... Radiological Health (CDRH) for qualification of medical device development tools (MDDT) for use in device.... Background The draft guidance describes the framework and process for the voluntary CDRH qualification of... science; and (4) more quickly and more clearly communicate with CDRH stakeholders about important advances...

  3. Developpement et evaluation d'un environnement informatise d'apprentissage pour faciliter l'integration des sciences et de la technologie

    NASA Astrophysics Data System (ADS)

    Saliba, Marie-Therese

    2011-12-01

    Through this research we fully assess the benefits brought by ExAO (computer-assisted experimentation) to school science and technology laboratories in Lebanon, and note its tangible contribution to the Pedagogical Robotics laboratory research at the University of Montreal, particularly in the development of the ExAO µlaboratory. We wanted to test the capabilities of ExAO in the classroom as: 1. A replacement for a traditional laboratory in the use of the experimental method. 2. A scientific investigation tool. 3. A tool for integrating the experimental sciences and mathematics. 4. A tool for integrating the experimental sciences, mathematics and technology in technoscientific learning. To do so, we mobilized 13 class groups, whose designated teachers experimented alongside their students, in order to assess realistically the benefits of implementing this microcomputer-based laboratory at school. Tests, evaluated using the results of learning activities undertaken by students, their responses to a questionnaire and feedback from teachers, show that: 1. Replacing a traditional laboratory with an ExAO µlaboratory poses no problem: students adapted to it in only ten minutes, and the speed with which data were graphed made the work more productive. 2. When investigating a physical phenomenon, the usability of the tutorial, combined with the ability to amplify the phenomenon before graphing it, allowed students to design and implement an experiment quickly and independently to verify their prediction. 3. Integrating mathematics into an experimental approach lets students grasp the phenomenon quickly; in addition, it gives more autonomy and meaning to the graphical and algebraic representations, allowing them to be used as cognitive tools to interpret the phenomenon. 4. The approach taken by the students to design and construct a technological object showed that this activity was easily carried out thanks to universal sensors, amplifiers offsetting the graphical modeling tool, and the tutorial's ability to transform any measured variable into another variable (for instance, a resistance variation into a temperature change, ...). This educational activity shows that students had no difficulty integrating mathematics, the experimental sciences and technology in a single learning activity in order to design and implement a functional piece of technology. By offering new educational opportunities, such as the ability to design, produce and validate a technological object, and new capacities to boost measurements, model physical phenomena and develop new sensors, the ExAO µlaboratory is an important addition to the experiments being conducted in ExAO. Keywords: ExAO, teaching, integration, Lebanese schools.

  4. Determining GPS average performance metrics

    NASA Technical Reports Server (NTRS)

    Moore, G. V.

    1995-01-01

    Analytic and semi-analytic methods are used to show that users of the GPS constellation can expect performance variations based on their location. Specifically, performance is shown to be a function of both altitude and latitude. These results stem from the fact that the GPS constellation is itself non-uniform. For example, GPS satellites are over four times as likely to be directly over Tierra del Fuego than over Hawaii or Singapore. Inevitable performance variations due to user location occur for ground, sea, air and space GPS users. These performance variations can be studied in an average relative sense. A semi-analytic tool which symmetrically allocates GPS satellite latitude belt dwell times among longitude points is used to compute average performance metrics. These metrics include average number of GPS vehicles visible, relative average accuracies in the radial, intrack and crosstrack (or radial, north/south, east/west) directions, and relative average PDOP or GDOP. The tool can be quickly changed to incorporate various user antenna obscuration models and various GPS constellation designs. Among other applications, tool results can be used in studies to: predict locations and geometries of best/worst case performance, design GPS constellations, determine optimal user antenna location and understand performance trends among various users.
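
    The non-uniformity described above follows from a standard result for circular orbits: the subsatellite point's latitude density is f(φ) ∝ cos φ / sqrt(sin²i − sin²φ) for |φ| < i, which grows sharply near the orbital inclination i. A small Python sketch with approximate latitudes (GPS inclination ≈ 55°):

        import math

        def dwell_density(lat_deg, incl_deg=55.0):
            """Relative time density of a circular-orbit subsatellite point at
            a latitude phi: f(phi) ~ cos(phi)/sqrt(sin^2(i) - sin^2(phi))."""
            phi, i = math.radians(lat_deg), math.radians(incl_deg)
            return math.cos(phi) / math.sqrt(math.sin(i) ** 2 - math.sin(phi) ** 2)

        # GPS orbits are inclined ~55 deg, so ground tracks pile up near
        # +/-55 deg latitude; approximate latitudes for the two locations.
        print(dwell_density(54.5) / dwell_density(21.3))  # Tierra del Fuego vs. Hawaii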

  5. Development of a ROV Deployed Video Analysis Tool for Rapid Measurement of Submerged Oil/Gas Leaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savas, Omer

    Expanded deep sea drilling around the globe makes it necessary to have readily available tools to quickly and accurately measure discharge rates from accidental submerged oil/gas leak jets, so that first responders can deploy adequate resources for containment. We have developed and tested a field-deployable video analysis software package which is able to provide sufficiently accurate flow rate estimates in the field for initial responders to accidental oil discharges in submarine operations. The essence of our approach is based on tracking coherent features at the interface in the near field of immiscible turbulent jets. The software package, UCB_Plume, is ready to be used by first responders for field implementation. We have tested the tool on submerged water and oil jets which are made visible using fluorescent dyes. We have been able to estimate the discharge rate within 20% accuracy. A high-end Windows laptop computer is suggested as the operating platform, and a USB-connected high-speed, high-resolution monochrome camera as the imaging device is sufficient for acquiring flow images under continuous unidirectional illumination and running the software in the field. Results are obtained in a matter of minutes.

  6. CAD/CAM/CNC.

    ERIC Educational Resources Information Center

    Domermuth, Dave; And Others

    1996-01-01

    Includes "Quick Start CNC (computer numerical control) with a Vacuum Filter and Laminated Plastic" (Domermuth); "School and Industry Cooperate for Mutual Benefit" (Buckler); and "CAD (computer-assisted drafting) Careers--What Professionals Have to Say" (Skinner). (JOW)

  7. BIM-Based Timber Structures Refurbishment of the Immovable Heritage Listed Buildings

    NASA Astrophysics Data System (ADS)

    Henek, Vladan; Venkrbec, Václav

    2017-12-01

    The use of Building Information Modeling (BIM) design tools is no longer an exception but common practice, and the benefits of BIM for designing new buildings or complex renovations have already been repeatedly published. The essence of BIM is to create a multidimensional geometric model of a planned building electronically, supplemented with the necessary information in advance of the construction process. Refurbishment is a specific process that combines new structures with demolished structures, or structures that need to be dismantled, repaired, and then returned to their original position; often these are historically valuable parts of the building. BIM-based repairs and refurbishments, especially complicated repairs of the roof truss structures of immovable heritage listed buildings, have not yet been credibly presented. However, BIM tools may be advantageous in this area, because the user can quickly respond to changes that become necessary during refurbishment, and can quickly assess and estimate the cost of any unexpected additional work. The paper deals with the use of BIM for repairs and refurbishment of buildings in general, with priority given to heritage-protected elements. The advantage of the proposed approach is demonstrated in a case study of the refurbishment of an immovable heritage listed truss roof, carried out in the Czech Republic. The case study consists of the 3D-modelled truss parts and the connected technological workflow base. The project work was carried out in one common model environment.

  8. QuickMap: a public tool for large-scale gene therapy vector insertion site mapping and analysis.

    PubMed

    Appelt, J-U; Giordano, F A; Ecker, M; Roeder, I; Grund, N; Hotz-Wagenblatt, A; Opelz, G; Zeller, W J; Allgayer, H; Fruehauf, S; Laufs, S

    2009-07-01

    Several events of insertional mutagenesis in pre-clinical and clinical gene therapy studies have created intense interest in assessing the genomic insertion profiles of gene therapy vectors. For the construction of such profiles, vector-flanking sequences detected by inverse PCR, linear amplification-mediated PCR or ligation-mediated PCR need to be mapped to the host cell's genome and compared to a reference set. Although remarkable progress has been achieved in mapping gene therapy vector insertion sites, public reference sets are lacking, as are the possibilities to quickly detect non-random patterns in experimental data. We developed a tool termed QuickMap, which uniformly maps and analyzes human and murine vector-flanking sequences within seconds (available at www.gtsg.org). Besides information about hits in chromosomes and fragile sites, QuickMap automatically determines insertion frequencies within ±250 kb of genes, cancer genes, pseudogenes, transcription factor and (post-transcriptional) miRNA binding sites, CpG islands and repetitive elements (short interspersed nuclear elements (SINE), long interspersed nuclear elements (LINE), Type II elements and LTR elements). Additionally, all experimental frequencies are compared with the data obtained from a reference set containing 1,000,000 random integrations ('random set'). Thus, for the first time a tool allowing high-throughput profiling of gene therapy vector insertion sites is available. It provides a basis for large-scale insertion site analyses, which is now urgently needed to discover novel gene therapy vectors with 'safe' insertion profiles.
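
    A typical downstream use of such a tool is testing whether an experimental insertion profile is enriched near some feature relative to the random reference set. A hedged Python sketch with made-up counts (requires SciPy ≥ 1.7 for binomtest; this is not QuickMap's actual statistics):

        from scipy.stats import binomtest

        # Hypothetical counts: 300 of 1200 experimental insertions fall within
        # +/-250 kb of a gene, versus 18% of the random reference set.
        observed_near_gene, total_insertions = 300, 1200
        background_rate = 0.18  # fraction of random sites near genes (made up)

        result = binomtest(observed_near_gene, total_insertions,
                           background_rate, alternative="greater")
        print(f"observed {observed_near_gene / total_insertions:.1%} vs "
              f"expected {background_rate:.0%}, p = {result.pvalue:.2e}")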

  9. A web-based data visualization tool for the MIMIC-II database.

    PubMed

    Lee, Joon; Ribey, Evan; Wallace, James R

    2016-02-04

    Although MIMIC-II, a public intensive care database, has been recognized as an invaluable resource for many medical researchers worldwide, becoming a proficient MIMIC-II researcher requires knowledge of SQL programming and an understanding of the MIMIC-II database schema. These are challenging requirements especially for health researchers and clinicians who may have limited computer proficiency. In order to overcome this challenge, our objective was to create an interactive, web-based MIMIC-II data visualization tool that first-time MIMIC-II users can easily use to explore the database. The tool offers two main features: Explore and Compare. The Explore feature enables the user to select a patient cohort within MIMIC-II and visualize the distributions of various administrative, demographic, and clinical variables within the selected cohort. The Compare feature enables the user to select two patient cohorts and visually compare them with respect to a variety of variables. The tool is also helpful to experienced MIMIC-II researchers who can use it to substantially accelerate the cumbersome and time-consuming steps of writing SQL queries and manually visualizing extracted data. Any interested researcher can use the MIMIC-II data visualization tool for free to quickly and conveniently conduct a preliminary investigation on MIMIC-II with a few mouse clicks. Researchers can also use the tool to learn the characteristics of the MIMIC-II patients. Since it is still impossible to conduct multivariable regression inside the tool, future work includes adding analytics capabilities. Also, the next version of the tool will aim to utilize MIMIC-III which contains more data.

  10. A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hodgson, M.; Li, W.

    2016-12-01

    Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over a large spatial extent. Such data are important for scientific discoveries in the Earth and ecological sciences and for natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to both data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, this framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides a valuable reference for developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
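
    The tile-based index mentioned in point 1) reduces to a simple idea: map each point to a fixed-size tile key, and let that key double as the storage partition and the unit of parallel work. A minimal Python sketch with a hypothetical tile size:

        from collections import defaultdict

        def tile_key(x: float, y: float, tile_size: float = 500.0) -> str:
            """Map a point to a fixed-size tile; the key can serve both as an
            HDFS partition name and as the unit of parallel work distribution."""
            return f"{int(x // tile_size)}_{int(y // tile_size)}"

        # Hypothetical (x, y, z) returns; grouping by key yields one task per tile.
        points = [(1023.4, 250.1, 87.2), (1499.9, 260.7, 90.0), (2210.0, 40.2, 75.5)]
        tiles = defaultdict(list)
        for x, y, z in points:
            tiles[tile_key(x, y)].append((x, y, z))
        print({k: len(v) for k, v in tiles.items()})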

  11. WIND VELOCITIES AND SAND FLUXES IN MESQUITE DUNE-LANDS IN THE NORTHERN CHIHUAHUAN DESERT: A COMPARISON BETWEEN FIELD MEASUREMENTS AND THE QUIC (QUICK URBAN AND INDUSTRIAL COMPLEX) MODEL

    EPA Science Inventory

    The poster shows comparisons of wind velocities and sand fluxes between field measurements and a computer model, called QUIC (Quick Urban & Industrial Complex). The comparisons were made for a small desert region in New Mexico.

  12. Evaluation of work zone enhancement software programs.

    DOT National Transportation Integrated Search

    2009-09-01

    The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in : developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, : VISSIM, and Spreadsheet models are the tools that MoD...

  13. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    PubMed

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and company perspectives, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvement in information received on healthy computer use, as well as in work posture and movement, was observed at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI QuickScan intervention programme did not prove to be cost-effective from either the societal or the company perspective and, therefore, this study does not provide a financial reason for implementing this intervention. However, with a relatively small investment, the programme did increase the number of workers who received information on healthy computer use and improved their work posture and movement. NTR1117.

  14. Comparison of Computational Approaches for Rapid Aerodynamic Assessment of Small UAVs

    NASA Technical Reports Server (NTRS)

    Shafer, Theresa C.; Lynch, C. Eric; Viken, Sally A.; Favaregh, Noah; Zeune, Cale; Williams, Nathan; Dansie, Jonathan

    2014-01-01

    Computational Fluid Dynamic (CFD) methods were used to determine the basic aerodynamic, performance, and stability and control characteristics of the unmanned air vehicle (UAV), Kahu. Accurate and timely prediction of the aerodynamic characteristics of small UAVs is an essential part of military system acquisition and air-worthiness evaluations. The forces and moments of the UAV were predicted using a variety of analytical methods for a range of configurations and conditions. The methods included Navier Stokes (N-S) flow solvers (USM3D, Kestrel and Cobalt) that take days to set up and hours to converge on a single solution; potential flow methods (PMARC, LSAERO, and XFLR5) that take hours to set up and minutes to compute; empirical methods (Datcom) that involve table lookups and produce a solution quickly; and handbook calculations. A preliminary aerodynamic database can be developed very efficiently by using a combination of computational tools. The database can be generated with low-order and empirical methods in linear regions, then replacing or adjusting the data as predictions from higher order methods are obtained. A comparison of results from all the data sources as well as experimental data obtained from a wind-tunnel test will be shown and the methods will be evaluated on their utility during each portion of the flight envelope.

  15. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  16. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692
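
    A Jenkins-CI build step for such a pipeline typically wraps a headless command-line invocation. A sketch of what a Python wrapper around CellProfiler's documented CLI might look like (the flags follow CellProfiler's published usage; all paths are hypothetical placeholders):

        import subprocess

        # Headless CellProfiler run of the kind a Jenkins-CI build step wraps:
        # -c (no GUI), -r (run), -p (pipeline), -i/-o (image/output folders).
        subprocess.run(
            ["cellprofiler", "-c", "-r",
             "-p", "/pipelines/nuclei_count.cppipe",
             "-i", "/data/hcs/plate_042/images",
             "-o", "/data/hcs/plate_042/results"],
            check=True,
        )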

  17. S3DB core: a framework for RDF generation and management in bioinformatics infrastructures

    PubMed Central

    2010-01-01

    Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way of the personalization of Medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed here was found to address the design criteria of biomedical computational infrastructure, such as those supporting large scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315

  18. Accuracy of a disability instrument to identify workers likely to develop upper extremity musculoskeletal disorders.

    PubMed

    Stover, Bert; Silverstein, Barbara; Wickizer, Thomas; Martin, Diane P; Kaufman, Joel

    2007-06-01

    Work-related upper extremity musculoskeletal disorders (MSD) result in substantial disability and expense. Identifying workers or jobs at high risk can trigger intervention before workers are injured or the condition worsens. We investigated a disability instrument, the QuickDASH, as a workplace screening tool to identify workers at high risk of developing upper extremity MSDs. Subjects included workers reporting recurring upper extremity MSD symptoms in the past 7 days (n = 559). The QuickDASH was reasonably accurate at baseline, with sensitivity of 73% for MSD diagnosis and 96% for symptom severity. Specificity was 56% for diagnosis and 53% for symptom severity. At 1-year follow-up, sensitivity and specificity for MSD diagnosis were 72% and 54%, respectively, as predicted by the baseline QuickDASH score. For symptom severity, sensitivity and specificity were 86% and 52%. An a priori target sensitivity of 70% and specificity of 50% was met for symptom severity, work pace and quality, and MSD diagnosis. The QuickDASH may be useful for identifying jobs or workers with increased risk for upper extremity MSDs. It may provide an efficient health surveillance screening tool useful for targeting early workplace intervention for prevention of upper extremity MSD problems.

  19. Testing simple deceptive honeypot tools

    NASA Astrophysics Data System (ADS)

    Yahyaoui, Aymen; Rowe, Neil C.

    2015-05-01

    Deception can be a useful defensive technique against cyber-attacks; it has the advantage of unexpectedness to attackers and offers a variety of tactics. Honeypots are a good tool for deception. They act as decoy computers to confuse attackers and exhaust their time and resources. This work tested the effectiveness of two free honeypot tools in real networks by varying their location and virtualization, and the effects of adding more deception to them. We tested a Web honeypot tool, Glastopf, and an SSH honeypot tool, Kippo. We deployed the Web honeypot in both a residential network and our organization's network, and as both real and virtual machines; the organization honeypot attracted more attackers starting in the third week. Results also showed that the virtual honeypots received attacks from more unique IP addresses. They also showed that adding deception to the Web honeypot, in the form of additional linked Web pages and interactive features, generated more interest by attackers. For the purpose of comparison, we examined the log files of a legitimate Web site, www.cmand.org. The traffic distributions for the Web honeypot and the legitimate Web site showed similarities (with much malicious traffic from Brazil), but the SSH honeypot was different (with much malicious traffic from China). Contrary to previous experiments where traffic to static honeypots decreased quickly, our honeypots received increasing traffic over a period of three months. It appears that both honeypot tools are useful for providing intelligence about cyber-attack methods, and that additional deception is helpful.

  20. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, as micron or sub-micron resolution can only be achieved on samples at the millimeter scale or smaller. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big-data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big-data tool for multi-scale scientific and engineering problems.
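
    The abstract only names the renormalization procedure; as a generic illustration of the idea (not necessarily the authors' scheme), a majority-rule block renormalization of a binary pore/solid volume looks like this in Python:

        import numpy as np

        def renormalize(binary_volume: np.ndarray, b: int = 2) -> np.ndarray:
            """One majority-rule renormalization step: each b x b x b block of
            the binary (pore=1/solid=0) volume collapses to one coarse voxel."""
            nx, ny, nz = (s // b for s in binary_volume.shape)
            v = binary_volume[:nx * b, :ny * b, :nz * b]
            blocks = v.reshape(nx, b, ny, b, nz, b).mean(axis=(1, 3, 5))
            return (blocks >= 0.5).astype(np.uint8)

        rng = np.random.default_rng(1)
        volume = (rng.random((64, 64, 64)) < 0.4).astype(np.uint8)  # 40% porosity
        coarse = renormalize(volume)
        print(volume.mean(), coarse.mean())  # porosity drifts under renormalization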

  1. The PAW/GIPAW approach for computing NMR parameters: a new dimension added to NMR study of solids.

    PubMed

    Charpentier, Thibault

    2011-07-01

    In 2001, Mauri and Pickard introduced the gauge including projected augmented wave (GIPAW) method, which enabled for the first time the calculation of all-electron NMR parameters in solids, i.e., accounting for periodic boundary conditions. The GIPAW method is rooted in the plane wave pseudopotential formalism of density functional theory (DFT), and avoids the use of the cluster approximation. This method has undoubtedly revitalized the interest in quantum chemical calculations in the solid-state NMR community. It has quickly evolved and improved, so that the calculation of the key components of NMR interactions, namely the shielding and electric field gradient tensors, has now become routine for most of the common nuclei studied in NMR. The availability of reliable implementations in several software packages (CASTEP, Quantum Espresso, PARATEC) makes its usage increasingly popular, perhaps soon indispensable for all NMR studies of materials. The majority of nuclei of the periodic table have already been investigated by GIPAW, and because of its high accuracy it is quickly becoming an essential tool for interpreting and understanding experimental NMR spectra, providing reliable assignments of the observed resonances to crystallographic sites and enabling a priori prediction of NMR data. The continuous increase of computing power makes ever larger (and thus more realistic) systems amenable to first-principles analysis. As the incorporation of dynamical effects and/or disorder is still in its early development, these areas will certainly be the prime targets in the near future. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. An ISVD-based Euclidian structure from motion for smartphones

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Vettore, A.; Pirotti, F.

    2014-06-01

    The development of Mobile Mapping systems over the last decades has made it possible to quickly collect georeferenced spatial measurements by means of sensors mounted on mobile vehicles. Despite the large number of applications that can potentially take advantage of such systems, their cost currently limits their use to certain specialized organizations, companies, and universities. However, the recent worldwide diffusion of powerful mobile devices embedded with GPS, Inertial Navigation System (INS), and imaging sensors is enabling the development of small and compact mobile mapping systems. More specifically, this paper considers the development of a 3D reconstruction system based on photogrammetry methods for smartphones (or other similar mobile devices). The limited computational resources available in such systems and the users' request for real-time reconstructions impose very stringent requirements on the computational burden of the 3D reconstruction procedure. This work takes advantage of recently developed mathematical tools (incremental singular value decomposition) and of photogrammetry techniques (structure from motion, Tomasi-Kanade factorization) to achieve a very computationally efficient Euclidean 3D reconstruction of the scene. Furthermore, thanks to the localization instrumentation embedded in the device, the obtained 3D reconstruction can be properly georeferenced.
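
    The Tomasi-Kanade factorization mentioned above exploits the fact that, under affine projection, the centered 2F × P matrix of P points tracked over F frames has rank 3. A minimal Python sketch (the metric/Euclidean upgrade and the incremental-SVD variant are omitted):

        import numpy as np

        def factorize(W: np.ndarray):
            """Tomasi-Kanade factorization sketch: center the measurement
            matrix, then split it by truncated SVD into camera motion (2F x 3)
            and shape (3 x P), defined up to a 3x3 linear ambiguity."""
            W0 = W - W.mean(axis=1, keepdims=True)  # remove per-row translation
            U, s, Vt = np.linalg.svd(W0, full_matrices=False)
            motion = U[:, :3] * np.sqrt(s[:3])
            shape = np.sqrt(s[:3])[:, None] * Vt[:3]
            return motion, shape

        # Synthetic check: measurements built from rank-3 factors factor back.
        M = np.random.randn(10, 3); S = np.random.randn(3, 40)
        W = M @ S
        motion, shape = factorize(W)
        print(np.allclose(motion @ shape, W - W.mean(axis=1, keepdims=True)))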

  3. Multibody simulation of vehicles equipped with an automatic transmission

    NASA Astrophysics Data System (ADS)

    Olivier, B.; Kouroussis, G.

    2016-09-01

    Nowadays automotive vehicles remain one of the most used modes of transportation. Furthermore, automatic transmissions are increasingly used to provide better driving comfort and a potential optimization of engine performance (by placing the gear shifts at specific engine and vehicle speeds). This paper presents an effective modeling of the vehicle using the multibody methodology (numerically computed under EasyDyn, an open-source, in-house library dedicated to multibody simulations). The transmission part of the vehicle, however, is described by the usual equations of motion computed using a systematic matrix approach: del Castillo's methodology for planetary gear trains. By coupling the analytic equations of the transmission with the equations computed by the multibody methodology, the performance of any vehicle can be obtained if the characteristics of each element in the vehicle are known. The multibody methodology offers the possibility of extending the vehicle model from 1D motion to 3D motion by taking rotations into account and implementing tire models. The modeling presented in this paper remains very efficient and provides an easy and quick vehicle simulation tool which could be used to calibrate the automatic transmission.
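
    Whatever matrix formulation is used for the transmission, a single planetary stage ultimately obeys the classical Willis relation; the small Python function below illustrates it (this is the textbook relation, not the paper's del Castillo matrices):

        def ring_speed(w_sun, w_carrier, z_sun, z_ring):
            """Willis equation for a planetary stage:
            (w_ring - w_carrier) / (w_sun - w_carrier) = -z_sun / z_ring.
            Returns the ring speed implied by the sun and carrier speeds."""
            return w_carrier - (z_sun / z_ring) * (w_sun - w_carrier)

        # A held ring (w_ring = 0) gives the familiar reduction
        # w_sun / w_carrier = 1 + z_ring / z_sun = 4 for these tooth counts.
        z_sun, z_ring = 30, 90
        print(ring_speed(4000.0, 1000.0, z_sun, z_ring))  # ~0.0 rpm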

  4. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  5. Tools for Using Citizen Science in Environmental, Agricultural, and Natural Resources Extension Programs

    ERIC Educational Resources Information Center

    Stofer, Kathryn A.

    2017-01-01

    Citizen science is quickly becoming a valuable tool in the Extension professional's tool kit. This is the case whether you are a 4-H agent looking to involve youth in agriscience and agriculture-related science, technology, engineering, and math experiential learning activities or an agriculture and natural resources agent seeking to help…

  6. minimega

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Fritz, John Floren

    2013-08-27

    Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. Minimega attempts to allow experiments to be brought up quickly with nearly no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux based virtual machine images.

  7. On Recommending Web 2.0 Tools to Personalise Learning

    ERIC Educational Resources Information Center

    Juškeviciene, Anita; Kurilovas, Eugenijus

    2014-01-01

    The paper aims to present research results on using Web 2.0 tools for learning personalisation. In the work, personalised Web 2.0 tools selection method is presented. This method takes into account student's learning preferences for content and communication modes tailored to the learning activities with a view to help the learner to quickly and…

  8. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images

    NASA Astrophysics Data System (ADS)

    Erdt, Marius; Sakas, Georgios

    2010-03-01

    This work presents a novel approach for model-based segmentation of the kidney in images acquired by Computed Tomography (CT). The developed computer-aided segmentation system is expected to support computer-aided diagnosis and operation planning. We have developed a deformable-model approach based on local shape constraints that prevent the model from deforming into neighboring structures while allowing the global shape to adapt freely to the data. These local constraints are derived from the anatomical structure of the kidney and the presence and appearance of neighboring organs. The adaptation process is guided by a rule-based deformation logic in order to improve the robustness of the segmentation in areas of diffuse organ boundaries. Our workflow consists of two steps: 1.) a user-guided positioning, and 2.) an automatic model adaptation using affine and free-form deformation in order to robustly extract the kidney. In cases which show pronounced pathologies, the system also offers real-time mesh editing tools for a quick refinement of the segmentation result. Evaluation results based on 30 clinical CT data sets show an average Dice coefficient of 93% compared to the ground truth. The results are therefore in most cases comparable to manual delineation. Computation times of the automatic adaptation step are lower than 6 seconds, which makes the proposed system suitable for application in clinical practice.
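
    Schematically, a locally constrained deformable-model iteration moves each vertex along its normal toward an image-derived target while clamping the displacement by a per-vertex limit. A toy Python sketch (illustrative only, not the paper's rule-based deformation logic):

        import numpy as np

        def constrained_step(vertices, normals, image_force, max_step):
            """One deformable-model iteration sketch: each vertex moves along
            its normal toward the image-derived target, but the displacement is
            clamped per vertex -- a simple local shape constraint that keeps
            the mesh from leaking into neighboring organs."""
            step = np.clip(image_force, -max_step, max_step)
            return vertices + step[:, None] * normals

        # Hypothetical toy mesh: 4 vertices with unit normals and local limits.
        verts = np.zeros((4, 3))
        norms = np.tile([0.0, 0.0, 1.0], (4, 1))
        force = np.array([0.8, -2.5, 0.1, 3.0])  # signed pull toward edges
        limit = np.array([1.0, 1.0, 0.2, 0.2])   # tighter near adjacent organs
        print(constrained_step(verts, norms, force, limit))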

  9. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

    This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space- averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  10. A Binary Array Asynchronous Sorting Algorithm with Using Petri Nets

    NASA Astrophysics Data System (ADS)

    Voevoda, A. A.; Romannikov, D. O.

    2017-01-01

    Nowadays the tasks of speeding up computations and/or optimizing them are topical. Among the approaches to these tasks, this paper considers applying parallelization and asynchronization to a sorting algorithm. Sorting methods are elementary and are used in a huge number of different applications. In the paper, we offer an array sorting method based on dividing the array into a set of independent adjacent pairs of numbers and comparing them in parallel and asynchronously; this distinguishes the offered method from traditional sorting algorithms (such as quicksort, merge sort, insertion sort and others). The algorithm is implemented with the use of Petri nets, as the most suitable tool for describing asynchronous systems.
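
    The pairwise scheme described in the abstract matches the classic odd-even transposition sort, in which every comparison within a phase touches a disjoint pair and can therefore fire concurrently (for example, as independent transitions of a Petri net). A sequential Python rendering of the phases:

        def odd_even_sort(a):
            """Odd-even transposition sort: alternately compare all disjoint
            (even, odd) and (odd, even) adjacent pairs; pairs within one phase
            are independent and could be compared concurrently."""
            a = list(a)
            for phase in range(len(a)):
                start = phase % 2
                for i in range(start, len(a) - 1, 2):  # independent pairs
                    if a[i] > a[i + 1]:
                        a[i], a[i + 1] = a[i + 1], a[i]
            return a

        print(odd_even_sort([5, 1, 4, 2, 8, 0, 2]))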

  11. Association Rule Analysis for Tour Route Recommendation and Application to Wctsnop

    NASA Astrophysics Data System (ADS)

    Fang, H.; Chen, C.; Lin, J.; Liu, X.; Fang, D.

    2017-09-01

    E-tourism systems increasingly provide intelligent tour recommendations for tourists. In this sense, a recommender system can make personalized suggestions and provide satisfying information associated with the tour cycle. Data mining is a proper tool for extracting potential information from large databases to support strategic decisions. In this study, association rule analysis based on the FP-growth algorithm is applied to find association relationships among scenic spots in different cities for tour route recommendation. In order to single out valuable rules, the Kulczynski interestingness measure is adopted and the imbalance ratio is computed. The proposed scheme was evaluated on the Wangluzhe cultural tourism service network operation platform (WCTSNOP), where it was verified that it can quickly recommend tour routes and rapidly enhance the recommendation quality.
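
    The two measures named above have simple closed forms: Kulc(A,B) = (P(B|A) + P(A|B)) / 2 and IR(A,B) = |sup(A) − sup(B)| / (sup(A) + sup(B) − sup(A,B)). A Python sketch with hypothetical support counts:

        def kulczynski(sup_a: int, sup_b: int, sup_ab: int) -> float:
            """Kulc(A,B) = (P(B|A) + P(A|B)) / 2 -- a null-invariant measure,
            i.e. unaffected by transactions containing neither A nor B."""
            return 0.5 * (sup_ab / sup_a + sup_ab / sup_b)

        def imbalance_ratio(sup_a: int, sup_b: int, sup_ab: int) -> float:
            """IR(A,B) = |sup(A) - sup(B)| / (sup(A) + sup(B) - sup(A,B))."""
            return abs(sup_a - sup_b) / (sup_a + sup_b - sup_ab)

        # Hypothetical counts: two scenic spots co-occurring in tour records.
        sup_a, sup_b, sup_ab = 1000, 120, 100
        print(kulczynski(sup_a, sup_b, sup_ab),
              imbalance_ratio(sup_a, sup_b, sup_ab))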

  12. Mining Deployment Optimization

    NASA Astrophysics Data System (ADS)

    Čech, Jozef

    2016-09-01

    The deployment problem, researched primarily in the military sector, is emerging in some other industries, mining included. The principal decision is how to deploy some activities in space and time to achieve a desired outcome while complying with certain requirements or limits. Requirements and limits are on the side of constraints, while minimizing costs or maximizing benefits is on the side of objectives. A model with application to the mining of a polymetallic deposit is presented. The main intention of the computer-based tool is to give a mining engineer quick, immediate decision solutions together with possibilities for experimentation. The task is to determine the strategic deployment of mining activities on a deposit, meeting planned output from the mine while complying with limited reserves and haulage capacities. Priorities and benefits can be formulated by the planner.
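
    As an illustration of this class of problem only (the article's actual model is not reproduced here), a toy linear program that deploys output across three hypothetical mining blocks subject to reserve limits, haulage capacity and a planned minimum output:

        from scipy.optimize import linprog

        benefit = [30.0, 45.0, 25.0]          # profit per tonne for blocks A, B, C
        c = [-b for b in benefit]              # linprog minimizes, so negate

        A_ub = [[1, 1, 1],                     # haulage capacity: total tonnes <= 10000
                [-1, -1, -1]]                  # planned output: total tonnes >= 6000
        b_ub = [10000, -6000]
        bounds = [(0, 4000), (0, 5000), (0, 3000)]   # limited reserves per block

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print(res.x, -res.fun)                 # optimal tonnages and total benefit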

  13. A Computer-Based Subduction-Zone-Earthquake Exercise for Introductory-Geology Classes.

    ERIC Educational Resources Information Center

    Shea, James Herbert

    1991-01-01

    Describes the author's computer-based program for a subduction-zone-earthquake exercise. Instructions for conducting the activity and obtaining the program from the author are provided. Written in IBM QuickBasic. (PR)

  14. Using E-Learning and ICT Courses in Educational Environment: A Review

    ERIC Educational Resources Information Center

    Salehi, Hadi; Shojaee, Mohammad; Sattar, Susan

    2015-01-01

    With the quick emergence of computers and related technology, electronic learning (E-learning) and Information and Communication Technology (ICT) have been extensively utilized in the education and training field. Miscellaneous methods of integrating computer technology and the context in which computers are used have affected student learning in…

  15. Waiting for the Return. Maximizing Investments in Technology.

    ERIC Educational Resources Information Center

    Workforce Economics, 1996

    1996-01-01

    Investments in technology and the number of workers using computers are growing quickly and at an increasing rate. From 1990-1995, investments in computers and related equipment tripled. Real (inflation-adjusted dollars) investments in computers and peripheral equipment increased from $200 million in 1973 to $91.6 billion in 1995. Increasing…

  16. Caries risk assessment tool and prevention protocol for public health nurses in mother and child health centers, Israel.

    PubMed

    Natapov, Lena; Dekel-Markovich, Dan; Granit-Palmon, Hadas; Aflalo, Efrat; Zusman, Shlomo Paul

    2018-01-01

    Dental caries is the most prevalent chronic disease in children. Caries risk assessment tools enable dentists, physicians, and nondental health care providers to assess the individual's risk. Intervention by nurses in primary care settings can contribute to the establishment of oral health habits and prevention of dental disease. In Israel, Mother and Child Health Centers provide free preventive services for pregnant women and children by public health nurses. A caries prevention program in health centers started in 2015. Nurses underwent special training regarding caries prevention. A customized Caries Risk Assessment tool and Prevention Protocol for nurses, based on the AAPD tool, was introduced. A two-step evaluation was conducted which included a questionnaire and in-depth phone interviews. Twenty-eight (out of 46) health centers returned a completed questionnaire. Most nurses believed that oral health preventive services should be incorporated into their daily work. In the in-depth phone interviews, nurses stated that the integration of the program into their busy daily schedule was realistic and appropriate. The lack of a specific dental module in the computer program was mentioned as an implementation difficulty. The wide use of our tool by nurses supports its simplicity and feasibility, which enable quick calculation and informed decision making. The nurses readily embraced the tool, and it became an integral part of their toolkit. We provide public health nurses with a caries risk assessment tool and prevention protocol, thus integrating oral health into the general health of infants and toddlers. © 2017 Wiley Periodicals, Inc.

  17. Fast Laser Holographic Interferometry For Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Lee, George

    1989-01-01

    Proposed system makes holographic interferograms quickly in wind tunnels. Holograms reveal two-dimensional flows around airfoils and provide information on distributions of pressure, structures of wake and boundary layers, and density contours of flow fields. Holograms form quickly in thermoplastic plates in wind tunnel. Plates rigid and left in place so neither vibrations nor photographic-development process degrades accuracy of holograms. System processes and analyzes images quickly. Semiautomatic micro-computer-based desktop image-processing unit now undergoing development moves easily to wind tunnel, and its speed and memory adequate for flows about airfoils.

  18. Logistics Company Partner 2.0.15 Tool: Quick Start Guide, 2015 Data Year - United States Version

    EPA Pesticide Factsheets

    This EPA document provides focused guidance and worksheets for SmartWay Logistics Company Partners on how to complete the SmartWay Logistics Tool and participate in the SmartWay program. (EPA publication # EPA-420-B-16-063)

  19. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments

    PubMed Central

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-01-01

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and stringent filtering constraints on the primers themselves. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests and lacking support for primer ranking, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
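
    The stringent per-primer constraints mentioned above typically include bounds on length, GC content, and melting temperature; MRPrimerW's actual filter set and homology testing are far more extensive. As a rough illustration of two such single-primer checks:

        def gc_content(primer):
            """Fraction of G and C bases in the primer sequence."""
            p = primer.upper()
            return (p.count("G") + p.count("C")) / len(p)

        def wallace_tm(primer):
            """Wallace rule Tm = 2(A+T) + 4(G+C), a rough estimate for short primers."""
            p = primer.upper()
            return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))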

  20. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  1. Combined Hydrologic (AGWA-KINEROS2) and Hydraulic (HEC2) Modeling for Post-Fire Runoff and Inundation Risk Assessment through a Set of Python Tools

    NASA Astrophysics Data System (ADS)

    Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.

    2016-12-01

    Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes, which can lead to post-fire flooding that can damage infrastructure and impair natural resources. Resources, structures, historical artifacts, and other assets that could be impacted by increased runoff are considered values at risk. The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate evaluation of rapid watershed assessments for post-fire planning efforts. High relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator to investigate those channels in the watershed that should be evaluated for more detailed analysis, especially if values at risk are within or near that channel. Modeling inundation extent along a channel would provide more specific guidance about risk along a channel. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river system scale. These models have been used to address flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis. The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid applications of combined hydrologic-hydraulic modeling. This combined modeling approach is built in the ESRI ArcGIS application to enable rapid model preparation, execution and result visualization for risk assessment in post-fire environments.
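
    The screening indicator described above reduces to a simple ratio. A sketch with hypothetical pre- and post-fire peak discharges (cubic meters per second) per reach:

        def relative_change(q_pre, q_post):
            """Relative change in simulated peak discharge for one channel reach."""
            return (q_post - q_pre) / q_pre

        reaches = {"reach_07": (10.1, 14.3), "reach_12": (4.2, 19.8)}   # hypothetical
        ranked = sorted(reaches, key=lambda r: relative_change(*reaches[r]), reverse=True)
        # reaches with the largest post-fire increases are examined first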

  2. OPMILL - MICRO COMPUTER PROGRAMMING ENVIRONMENT FOR CNC MILLING MACHINES THREE AXIS EQUATION PLOTTING CAPABILITIES

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1994-01-01

    OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. Turbo BASIC and Turbo C are trademarks of Borland International.
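
    OPMILL's Kearney and Trecker output format is not shown here; the idea behind the "equation plotter" can be sketched as sampling parametric axis equations into a sequence of straight-line cutter moves:

        import math

        def equations_to_moves(fx, fy, fz, t0, t1, steps):
            """Sample up to three axis equations into a list of XYZ move targets."""
            moves = []
            for i in range(steps + 1):
                t = t0 + (t1 - t0) * i / steps
                moves.append((round(fx(t), 4), round(fy(t), 4), round(fz(t), 4)))
            return moves

        # quarter circle of radius 25 in XY at a constant cutting depth
        path = equations_to_moves(lambda t: 25 * math.cos(t),
                                  lambda t: 25 * math.sin(t),
                                  lambda t: -2.0,
                                  0.0, math.pi / 2, 20)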

  3. Large scale track analysis for wide area motion imagery surveillance

    NASA Astrophysics Data System (ADS)

    van Leeuwen, C. J.; van Huis, J. R.; Baan, J.

    2016-10-01

    Wide Area Motion Imagery (WAMI) enables image based surveillance of areas that can cover multiple square kilometers. Interpreting and analyzing information from such sources becomes increasingly time-consuming as more data is added from newly developed methods for information extraction. Captured from a moving Unmanned Aerial Vehicle (UAV), the high-resolution images allow detection and tracking of moving vehicles, but this is a highly challenging task. By using a chain of computer vision detectors and machine learning techniques, we are capable of producing high quality track information of more than 40 thousand vehicles per five minutes. When faced with such a vast number of vehicular tracks, it is useful for analysts to be able to quickly query information based on region of interest, color, maneuvers or other high-level types of information, to gain insight and find relevant activities in the flood of information. In this paper we propose a set of tools, combined in a graphical user interface, which allows data analysts to survey vehicles in a large observed area. In order to retrieve (parts of) images from the high-resolution data, we developed a multi-scale tile-based video file format that allows us to quickly obtain only a part, or a sub-sampling, of the original high-resolution image. By storing tiles of a still image according to a predefined order, we can quickly retrieve a particular region of the image at any relevant scale, by skipping to the correct frames and reconstructing the image. Location based queries allow a user to select tracks around a particular region of interest such as a landmark, building or street. By using an integrated search engine, users can quickly select tracks that are in the vicinity of locations of interest. Another time-reducing method when searching for a particular vehicle is to filter on color or color intensity. Automatic maneuver detection adds information to the tracks that can be used to find vehicles based on their behavior.
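
    The paper's file format details are not spelled out here; the usual tile-pyramid indexing such a format implies, where each level halves the image resolution, can be sketched as:

        def tiles_for_region(x0, y0, x1, y1, level, tile=256):
            """Tile indices covering a full-resolution pixel region at a pyramid level."""
            scale = 2 ** level                     # level 0 is full resolution
            cols = range(int(x0 / scale) // tile, int(x1 / scale) // tile + 1)
            rows = range(int(y0 / scale) // tile, int(y1 / scale) // tile + 1)
            return [(level, r, c) for r in rows for c in cols]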

  4. SequenceCEROSENE: a computational method and web server to visualize spatial residue neighborhoods at the sequence level.

    PubMed

    Heinke, Florian; Bittrich, Sebastian; Kaiser, Florian; Labudde, Dirk

    2016-01-01

    To understand the molecular function of biopolymers, studying their structural characteristics is of central importance. Graphics programs are often utilized to inspect these properties, but with the increasing number of available structures in databases or structure models produced by automated modeling frameworks, this process requires assistance from tools that allow automated structure visualization. In this paper, a web server and its underlying method for generating graphical sequence representations of molecular structures are presented. The method, called SequenceCEROSENE (color encoding of residues obtained by spatial neighborhood embedding), retrieves the sequence of each amino acid or nucleotide chain in a given structure and produces a color coding for each residue based on three-dimensional structure information. From this, color-highlighted sequences are obtained, where residue coloring represents three-dimensional residue locations in the structure. This color encoding thus provides a one-dimensional representation, from which spatial interactions, proximity and relations between residues or entire chains can be deduced quickly and solely from color similarity. Furthermore, additional heteroatoms and chemical compounds bound to the structure, like ligands or coenzymes, are processed and reported as well. To provide free access to SequenceCEROSENE, a web server has been implemented that allows generating color codings for structures deposited in the Protein Data Bank or structure models uploaded by the user. Besides retrieving visualizations in popular graphic formats, underlying raw data can be downloaded as well. In addition, the server provides user interactivity with generated visualizations and the three-dimensional structure in question. Color-encoded sequences generated by SequenceCEROSENE can help the researcher quickly perceive the general characteristics of a structure of interest (or entire sets of complexes), supporting the initial phase of structure-based studies. In this respect, the web server can be a valuable tool, as users are allowed to process multiple structures, quickly switch between results, and interact with generated visualizations in an intuitive manner. The SequenceCEROSENE web server is available at https://biosciences.hs-mittweida.de/seqcerosene.
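
    SequenceCEROSENE's actual embedding is more elaborate than this, but the core idea (spatially close residues receive similar colors) can be sketched by normalizing per-residue coordinates directly into RGB:

        import numpy as np

        def color_by_position(coords):
            """Map (n_residues, 3) C-alpha coordinates to RGB values in [0, 1].

            Residues that are close in space receive similar colors.
            """
            coords = np.asarray(coords, dtype=float)
            lo, hi = coords.min(axis=0), coords.max(axis=0)
            return (coords - lo) / np.where(hi > lo, hi - lo, 1.0)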

  5. An efficient parallel termination detection algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A. H.; Crivelli, S.; Jessup, E. R.

    2004-05-27

    Information local to any one processor is insufficient to monitor the overall progress of most distributed computations. Typically, a second distributed computation for detecting termination of the main computation is necessary. In order to be a useful computational tool, the termination detection routine must operate concurrently with the main computation, adding minimal overhead, and it must promptly and correctly detect termination when it occurs. In this paper, we present a new algorithm for detecting the termination of a parallel computation on distributed-memory MIMD computers that satisfies all of those criteria. A variety of termination detection algorithms have been devised. Of these, the algorithm presented by Sinha, Kale, and Ramkumar (henceforth, the SKR algorithm) is unique in its ability to adapt to the load conditions of the system on which it runs, thereby minimizing the impact of termination detection on performance. Because their algorithm also detects termination quickly, we consider it to be the most efficient practical algorithm presently available. The termination detection algorithm presented here was developed for use in the PMESC programming library for distributed-memory MIMD computers. Like the SKR algorithm, our algorithm adapts to system loads and imposes little overhead. Also like the SKR algorithm, ours is tree-based, and it does not depend on any assumptions about the physical interconnection topology of the processors or the specifics of the distributed computation. In addition, our algorithm is easier to implement and requires only half as many tree traversals as does the SKR algorithm. This paper is organized as follows. In section 2, we define our computational model. In section 3, we review the SKR algorithm. We introduce our new algorithm in section 4, and prove its correctness in section 5. We discuss its efficiency and present experimental results in section 6.
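
    As a minimal illustration of the tree-based idea only (not the SKR algorithm or the PMESC implementation), the sketch below combines per-processor idle flags and message counts up a tree, assuming each node object exposes idle, sent, received and children attributes. A real algorithm must additionally cope with counts collected while messages are in flight, typically by repeating waves until they agree:

        def gather(node):
            """One wave: combine (all_idle, sent, received) over the subtree."""
            idle, sent, recv = node.idle, node.sent, node.received
            for child in node.children:
                c_idle, c_sent, c_recv = gather(child)
                idle, sent, recv = idle and c_idle, sent + c_sent, recv + c_recv
            return idle, sent, recv

        def termination_predicate(root):
            # terminated only if every process is idle and no message is in flight
            idle, sent, recv = gather(root)
            return idle and sent == recv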

  6. The iPhyClassifier, an interactive online tool for phytoplasma classification and taxonomic assignment

    USDA-ARS?s Scientific Manuscript database

    The iPhyClassifier is an Internet-based research tool for quick identification and classification of diverse phytoplasmas. The iPhyClassifier simulates laboratory restriction enzyme digestions and subsequent gel electrophoresis and generates virtual restriction fragment length polymorphism (RFLP) p...

  7. Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start Manual

    EPA Science Inventory

    The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the ques...

  8. Routes to new networks : a guide to social media for the public transportation industry

    DOT National Transportation Integrated Search

    2009-11-01

    Today, utilization of many social media tools is seen as an added-value opportunity. As society transforms, these tools will quickly become a necessity for those looking to communicate. It will be imperative that you are equipped with the knowledge a...

  9. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  10. Analysis of Selected Enhancements to the En Route Central Computing Complex

    DOT National Transportation Integrated Search

    1981-09-01

    This report analyzes selected hardware enhancements that could improve the performance of the 9020 computer systems, which are used to provide en route air traffic control services. These enhancements could be implemented quickly, would be relatively...

  11. Quick fuzzy backpropagation algorithm.

    PubMed

    Nikov, A; Stoeva, S

    2001-03-01

    A modification of the fuzzy backpropagation (FBP) algorithm called the QuickFBP algorithm is proposed, in which the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP and the FBP algorithms are defined and proved for: (1) single output neural networks in the case of training patterns with different targets; and (2) multiple output neural networks in the case of training patterns with an equivalued target vector. They support the automation of the weights training process (quasi-unsupervised learning), establishing the target value(s) depending on the network's input values. In these cases the simulation results confirm the convergence of both algorithms. An example with a large-sized neural network illustrates the significantly greater training speed of the QuickFBP algorithm compared to the FBP algorithm. The adaptation of an interactive web system to users on the basis of the QuickFBP algorithm is presented. Since the QuickFBP algorithm ensures quasi-unsupervised learning, this implies its broad applicability in areas such as adaptive and adaptable interactive systems and data mining.

  12. A decision-support tool for the control of urban noise pollution.

    PubMed

    Suriano, Marcia Thais; de Souza, Léa Cristina Lucas; da Silva, Antonio Nelson Rodrigues

    2015-07-01

    Improving the quality of life is increasingly seen as an important urban planning goal. In order to reach it, various tools are being developed to mitigate the negative impacts of human activities on society. This paper develops a methodology for quantifying the population's exposure to noise by proposing a classification of urban blocks. Taking into account the vehicular flow and traffic composition of the surroundings of urban blocks, we generated a noise map by applying a computational simulation. The urban blocks were classified according to their noise range, and the population of each urban block was then estimated by a process based on the census tract and the constructed area of the blocks. The acoustical classes of urban blocks and the number of inhabitants per block were compared so that the population exposed to noise levels above 65 dB(A), the upper limit established by legislation, could be estimated. As a result, we developed a map of the study area, so that urban blocks that should be priority targets for noise mitigation actions can be quickly identified.
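
    The final estimation step reduces to a simple aggregation over the classified blocks. A sketch with hypothetical block data:

        # population per urban block and each block's simulated noise level (hypothetical)
        blocks = [
            {"id": 1, "population": 320, "noise_dba": 68.0},
            {"id": 2, "population": 150, "noise_dba": 61.5},
            {"id": 3, "population": 410, "noise_dba": 66.2},
        ]

        LIMIT_DBA = 65.0   # legal upper limit cited in the study
        exposed = sum(b["population"] for b in blocks if b["noise_dba"] > LIMIT_DBA)
        print(f"{exposed} inhabitants above {LIMIT_DBA} dB(A)")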

  13. Using Web 2.0 (and Beyond?) in Space Flight Operations Control Centers

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    Word processing was one of the earliest uses for small workstations, but we quickly learned that desktop computers were far more than e-typewriters. Similarly, "Web 2.0" capabilities, particularly advanced search engines, chats, wikis, blogs, social networking, and the like, offer tools that could significantly improve our efficiency at managing the avalanche of information and decisions needed to operate space vehicles in realtime. However, could does not necessarily equal should. We must wield two-edged swords carefully to avoid stabbing ourselves. This paper examines some Web 2.0 tools, with an emphasis on social media, and suggests which ones might be useful or harmful in real-time space operations control environments, based on the author's experience as a Payload Crew Communicator (PAYCOM) at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for the International Space Station (ISS) and on discussions with other space flight operations control organizations and centers. There is also some discussion of an offering or two that may come from beyond the current cyber-horizon.

  14. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  15. Computer Virus Protection

    ERIC Educational Resources Information Center

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  16. Joint resonant CMB power spectrum and bispectrum estimation

    NASA Astrophysics Data System (ADS)

    Meerburg, P. Daniel; Münchmeyer, Moritz; Wandelt, Benjamin

    2016-02-01

    We develop the tools necessary to assess the statistical significance of resonant features in the CMB correlation functions, combining power spectrum and bispectrum measurements. This significance is typically addressed by running a large number of simulations to derive the probability density function (PDF) of the feature-amplitude in the Gaussian case. Although these simulations are tractable for the power spectrum, for the bispectrum they require significant computational resources. We show that, by assuming that the PDF is given by a multivariate Gaussian where the covariance is determined by the Fisher matrix of the sine and cosine terms, we can efficiently produce spectra that are statistically close to those derived from full simulations. By drawing a large number of spectra from this PDF, both for the power spectrum and the bispectrum, we can quickly determine the statistical significance of candidate signatures in the CMB, considering both single frequency and multifrequency estimators. We show that for resonance models, cosmology and foreground parameters have little influence on the estimated amplitude, which allows us to simplify the analysis considerably. A more precise likelihood treatment can then be applied to candidate signatures only. We also discuss a modal expansion approach for the power spectrum, aimed at quickly scanning through large families of oscillating models.
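
    A minimal sketch of the sampling step, assuming a hypothetical 2x2 Fisher matrix for the sine and cosine amplitudes at one resonance frequency:

        import numpy as np

        rng = np.random.default_rng(0)
        F = np.array([[40.0, 5.0],            # hypothetical Fisher matrix
                      [5.0, 30.0]])
        cov = np.linalg.inv(F)                # covariance of the Gaussian PDF

        draws = rng.multivariate_normal(np.zeros(2), cov, size=100_000)
        amp = np.hypot(draws[:, 0], draws[:, 1])   # total feature amplitude per draw

        candidate = 0.5                       # amplitude measured for a candidate signature
        p_value = (amp >= candidate).mean()   # tail probability under the Gaussian null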

  17. Validation of a Pressure-Based Combustion Simulation Tool Using a Single Element Injector Test Problem

    NASA Technical Reports Server (NTRS)

    Thakur, Siddarth; Wright, Jeffrey

    2006-01-01

    The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study which involves a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. A refined grid, which is clustered in the vicinity of the solid walls as well as the flame, is used to obtain a steady state solution which may be considered as the best solution attainable with the steady-state RANS methodology. From a design point of view, quick turnaround times are desirable; with this in mind, coarser grids are also employed and the resulting solutions are evaluated with respect to the fine grid solution.

  18. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    NASA Astrophysics Data System (ADS)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple to learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  19. SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karthik, Rajasekar; Lu, Wei

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework will use large scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at the global level. Also, a computational model of agent-based traffic simulation is used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based high resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks for scaling this framework to handle big data workloads in real time. On our back-end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front-end, the visualization performance of microscopic evacuation results is still not efficient enough due to high volume data communication between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, Open Layers 3 and HTML5 technologies. We will briefly describe each one and how we are using and leveraging these technologies to provide an efficient tool for emergency management organizations. Our early experimentation demonstrates that using the above technologies is a promising approach to build a scalable and high performance urban emergency evacuation framework that can improve traffic mobility and safety under critical infrastructure disruption in today's socially connected world.

  20. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    NASA Astrophysics Data System (ADS)

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom with run times less than 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf, and the trailing shock from 1.2 to 0.87 psf, 41% and 28% reductions respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that lobe balance is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce the sonic boom.

  1. Program helps quickly calculate deviated well path

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, M.P.

    1993-11-22

    A BASIC computer program quickly calculates the angle and measured depth of a simple directional well given only the true vertical depth and total displacement of the target. Many petroleum engineers and geologists need a quick, easy method to calculate the angle and measured depth necessary to reach a target in a proposed deviated well bore. Too many of the existing programs are large and require much input data. The drilling literature is full of equations and methods to calculate the course of well paths from surveys taken after a well is drilled. Very little information, however, covers how to calculate well bore trajectories for proposed wells from limited data. Furthermore, many of the equations are quite complex and difficult to use. A figure lists a computer program with the equations to calculate the well bore trajectory necessary to reach a given displacement and true vertical depth (TVD) for a simple build plan. It can be run on an IBM compatible computer with MS-DOS version 5 or higher, QBasic, or any BASIC that does not require line numbers. The QuickBASIC 4.5 compiler will also run the program. The equations are based on conventional geometry and trigonometry.
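
    The geometry for the simplest case, a straight slant path from surface to target, is a few lines of trigonometry; the program in the article additionally handles a build section, i.e. a circular arc, which this sketch ignores:

        import math

        def slant_well(tvd, displacement):
            """Hole angle (degrees) and measured depth for a straight slant path."""
            angle = math.degrees(math.atan2(displacement, tvd))
            md = math.hypot(tvd, displacement)
            return angle, md

        print(slant_well(tvd=8000.0, displacement=2000.0))   # about 14.0 deg, 8246 ft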

  2. Tool simplifies machining of pipe ends for precision welding

    NASA Technical Reports Server (NTRS)

    Matus, S. T.

    1969-01-01

    Single tool prepares a pipe end for precision welding by simultaneously performing internal machining, end facing, and bevel cutting to specification standards. The machining operation requires only one milling adjustment, can be performed quickly, and produces the high quality pipe-end configurations required to ensure precision-welded joints.

  3. Internet Tools Access Administrative Data at the University of Delaware.

    ERIC Educational Resources Information Center

    Jacobson, Carl

    1995-01-01

    At the University of Delaware, World Wide Web tools are used to produce multiplatform administrative applications, including hyperreporting, mixed media, electronic forms, and kiosk services. Web applications are quickly and easily crafted to interact with administrative databases. They are particularly suited to customer outreach efforts,…

  4. Minimizing Security Vulnerabilities in High-Tech Classrooms

    ERIC Educational Resources Information Center

    Ozkan, Betul C.; Gunay, Vedat

    2004-01-01

    Emerging technologies are quickly becoming part of daily learning and teaching endeavors in academia. Due to the access to certain high-tech tools educators must learn how to integrate these tools in educational settings. However, many also encounter problems and weaknesses in the same high-tech environment that uses and delivers information…

  5. ARTVAL user guide : user guide for the ARTerial eVALuation computational engine.

    DOT National Transportation Integrated Search

    2015-06-01

    This document provides guidance on the use of the ARTVAL (Arterial Evaluation) computational : engine. The engine implements the Quick Estimation Method for Urban Streets (QEM-US) : described in Highway Capacity Manual (HCM2010) as the core computati...

  6. Hybrid automated reliability predictor integrated work station (HiREL)

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1991-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated reliability (HiREL) workstation tool system marks another step toward the goal of producing a totally integrated computer aided design (CAD) workstation design capability. Since a reliability engineer must generally graphically represent a reliability model before he can solve it, the use of a graphical input description language increases productivity and decreases the incidence of error. The captured image displayed on a cathode ray tube (CRT) screen serves as a documented copy of the model and provides the data for automatic input to the HARP reliability model solver. The introduction of dependency gates to a fault tree notation allows the modeling of very large fault tolerant system models using a concise and visually recognizable and familiar graphical language. In addition to aiding in the validation of the reliability model, the concise graphical representation presents company management, regulatory agencies, and company customers a means of expressing a complex model that is readily understandable. The graphical postprocessor computer program HARPO (HARP Output) makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes.

  7. Analysis of metabolomics datasets with high-performance computing and metabolite atlases

    DOE PAGES

    Yao, Yushu; Sun, Terence; Wang, Tony; ...

    2015-07-20

    Even with the widespread use of liquid chromatography mass spectrometry (LC/MS) based metabolomics, there are still a number of challenges facing this promising technique. Many diverse experimental workflows exist, yet there is a lack of infrastructure and systems for tracking and sharing information. Here, we describe the Metabolite Atlas framework and interface that provides highly efficient, web-based access to raw mass spectrometry data in concert with assertions about the chemicals detected, to help address some of these challenges. This integration, by design, enables experimentalists to explore their raw data and to specify and refine feature annotations such that they can be leveraged for future experiments. Fast queries of the data through the web using SciDB, a parallelized database for high performance computing, make this process operate quickly. Furthermore, by using scripting containers, such as IPython or Jupyter, to analyze the data, scientists can utilize a wide variety of freely available graphing, statistics, and information management resources. In addition, the interfaces facilitate integration with systems biology tools to ultimately link metabolomics data with biological models.

  8. WaveJava: Wavelet-based network computing

    NASA Astrophysics Data System (ADS)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelets are a powerful theory, but their successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed using object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At distributed sites around the net, these data packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also fits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communications.
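
    WaveJava's class library is not reproduced here; the multi-resolution packet idea can be sketched with the simplest wavelet, the Haar transform, where a coarse approximation is transmitted first and detail packets progressively refine it:

        def haar_step(signal):
            """One Haar level: pairwise averages (approximation) and differences (detail)."""
            approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
            detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
            return approx, detail

        def multires_packets(signal, levels):
            """Packets ordered coarse to fine: approximation first, then details."""
            packets, approx = [], list(signal)
            for _ in range(levels):
                approx, detail = haar_step(approx)
                packets.append(detail)
            return [approx] + packets[::-1]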

  9. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
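
    At its core, a discrete event simulation advances the clock from event to event instead of time-stepping. A minimal sequential event loop (TADSim itself is parallel and models the TAD stages in far more detail) might look like:

        import heapq

        def simulate(initial_events, handlers, t_end):
            """Pop the earliest event; its handler returns follow-up (time, kind) events."""
            queue = list(initial_events)          # (time, kind) tuples
            heapq.heapify(queue)
            while queue:
                t, kind = heapq.heappop(queue)
                if t > t_end:
                    break
                for t_next, kind_next in handlers[kind](t):
                    heapq.heappush(queue, (t_next, kind_next))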

  10. Morphological imaging and quantification of axial xylem tissue in Fraxinus excelsior L. through X-ray micro-computed tomography.

    PubMed

    Koddenberg, Tim; Militz, Holger

    2018-05-05

    The popularity of X-ray based imaging methods has continued to increase in research domains. In wood research, X-ray micro-computed tomography (XμCT) is useful for structural studies examining the three-dimensional and complex xylem tissue of trees qualitatively and quantitatively. In this study, XμCT made it possible to visualize and quantify the spatial xylem organization of the angiosperm species Fraxinus excelsior L. on the microscopic level. Through image analysis, it was possible to determine morphological characteristics of the cellular axial tissue (vessel elements, fibers, and axial parenchyma cells) three-dimensionally. X-ray imaging at high resolutions provides very distinct visual insight into the xylem structure. Numerical analyses performed through semi-automatic procedures made it possible to quickly quantify cell characteristics (length, diameter, and volume of cells). Use of various spatial resolutions (0.87-5 μm) revealed boundaries users should be aware of. Nevertheless, our findings, both qualitative and quantitative, demonstrate XμCT to be a valuable tool for studying the spatial cell morphology of F. excelsior. Copyright © 2018. Published by Elsevier Ltd.

  11. A Bayesian comparative effectiveness trial in action: developing a platform for multisite study adaptive randomization.

    PubMed

    Brown, Alexandra R; Gajewski, Byron J; Aaronson, Lauren S; Mudaranthakam, Dinesh Pal; Hunt, Suzanne L; Berry, Scott M; Quintana, Melanie; Pasnoor, Mamatha; Dimachkie, Mazen M; Jawdat, Omar; Herbelin, Laura; Barohn, Richard J

    2016-08-31

    In the last few decades, the number of trials using Bayesian methods has grown rapidly. Publications prior to 1990 included only three clinical trials that used Bayesian methods, but that number quickly jumped to 19 in the 1990s and to 99 from 2000 to 2012. While this literature provides many examples of Bayesian Adaptive Designs (BAD), none of the available papers walks the reader through the detailed process of conducting a BAD. This paper fills that gap by describing the BAD process used for one comparative effectiveness trial (Patient Assisted Intervention for Neuropathy: Comparison of Treatment in Real Life Situations) that can be generalized for use by others. A BAD was chosen with efficiency in mind. Response-adaptive randomization allows the potential for substantially smaller sample sizes, and can provide faster conclusions about which treatment or treatments are most effective. An Internet-based electronic data capture tool, which features a randomization module, facilitated data capture across study sites, and an in-house computation software program was developed to implement the response-adaptive randomization. A process for adapting randomization with minimal interruption to study sites was developed. A new randomization table can be generated quickly and can be seamlessly integrated into the data capture tool with minimal interruption to study sites. This manuscript is the first to detail the technical process used to evaluate a multisite comparative effectiveness trial using adaptive randomization. An important opportunity for the application of Bayesian trials is in comparative effectiveness trials. The specific case study presented in this paper can be used as a model for conducting future clinical trials using a combination of statistical software and a web-based application. ClinicalTrials.gov Identifier: NCT02260388, registered on 6 October 2014.
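
    The trial's Bayesian model is not specified in this summary. As a generic illustration of response-adaptive randomization, a Thompson-sampling-style allocation over hypothetical arm outcomes with Beta posteriors:

        import numpy as np

        rng = np.random.default_rng(1)
        successes = np.array([12, 18, 9])     # hypothetical responses per arm so far
        failures = np.array([20, 14, 23])

        # probability each arm is best under its Beta posterior
        draws = rng.beta(1 + successes, 1 + failures, size=(10_000, 3))
        alloc_probs = np.bincount(draws.argmax(axis=1), minlength=3) / draws.shape[0]

        next_arm = rng.choice(3, p=alloc_probs)   # randomize the next patient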

  12. Evaluation of High Resolution Imagery and Elevation Data

    DTIC Science & Technology

    2009-06-01

    the value of cutting-edge geospatial tools while keeping the data constant, the present experiment evaluated the effect of higher resolution imagery...and elevation data while keeping the tools constant. The high resolution data under evaluation was generated from TEC’s Buckeye system, an...results. As researchers and developers provide increasingly advanced tools to process data more quickly and accurately, it is necessary to assess each

  13. Automatic Tension Adjuster For Flexible-Shaft Grinder

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1990-01-01

    Flexible shaft of grinding tool automatically maintained in tension by air pressure. Probelike tool bent to reach hard-to-reach areas for grinding and polishing. Unless shaft held in tension, however, it rubs against its sheath, overheating and wearing out quickly. By taking up slack in flexible cable, tension adjuster reduces friction and enables tool to operate more efficiently, in addition to lengthening operating life.

  14. Research and evaluation of biomass resources/conversion/utilization systems. Biomass allocation model. Volume 1: Test and appendices A & B

    NASA Astrophysics Data System (ADS)

    Stringer, R. P.; Ahn, Y. K.; Chen, H. T.; Helm, R. W.; Nelson, E. T.; Shields, K. J.

    1981-08-01

    A biomass allocation model was developed to show the most profitable combination of biomass feedstocks, thermochemical conversion processes, and fuel products to serve the seasonal conditions in a regional market. This optimization model provides a tool for quickly calculating which of a large number of potential biomass missions is the most profitable mission. Other components of the system serve as a convenient storage and retrieval mechanism for biomass marketing and thermochemical conversion processing data. The system can be accessed through the use of a computer terminal, or it could be adapted to a microprocessor. A User's Manual for the system is included. Biomass derived fuels included in the data base are the following: medium Btu gas, low Btu gas, substitute natural gas, ammonia, methanol, electricity, gasoline, and fuel oil.

  15. Titian: Data Provenance Support in Spark

    PubMed Central

    Interlandi, Matteo; Shah, Kshitij; Tetali, Sai Deep; Gulzar, Muhammad Ali; Yoo, Seunghyun; Kim, Miryung; Millstein, Todd; Condie, Tyson

    2015-01-01

    Debugging data processing logic in Data-Intensive Scalable Computing (DISC) systems is a difficult and time consuming effort. Today’s DISC systems offer very little tooling for debugging programs, and as a result programmers spend countless hours collecting evidence (e.g., from log files) and performing trial and error debugging. To aid this effort, we built Titian, a library that enables data provenance—tracking data through transformations—in Apache Spark. Data scientists using the Titian Spark extension will be able to quickly identify the input data at the root cause of a potential bug or outlier result. Titian is built directly into the Spark platform and offers data provenance support at interactive speeds—orders-of-magnitude faster than alternative solutions—while minimally impacting Spark job performance; observed overheads for capturing data lineage rarely exceed 30% above the baseline job execution time. PMID:26726305

  16. The Biological Relevance of Artificial Life: Lessons from Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano

    2000-01-01

    There is no fundamental reason why A-life couldn't simply be a branch of computer science that deals with algorithms that are inspired by, or emulate, biological phenomena. However, if these are the limits we place on this field, we miss the opportunity to help advance Theoretical Biology and to contribute to a deeper understanding of the nature of life. The history of Artificial Intelligence provides a good example, in that early interest in the nature of cognition was quickly lost to the process of building tools, such as "expert systems", that were certainly useful but provided little insight into the nature of cognition. Based on this lesson, I will discuss criteria for increasing the biological relevance of A-life and the probability that this field may provide a theoretical foundation for Biology.

  17. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

    QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, therefore eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting device independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library. Programs written to use the MASL subroutines can be made to output to the Calcomp 1055 plotter, the Hewlett-Packard 2648 graphics terminal, the HP 7221, 7475 and 7550 pen plotters, the Tektronix 40xx and 41xx series graphics terminals, the DEC VT125/VT240 graphics terminals, the QMS 800 laser printer, the Sun Microsystems monochrome display, the Ridge Computers monochrome display, the IBM/PC color display, or a "dumb" terminal or printer. Programs using this library can be written so that they always use the same type of plotter or they can allow the choice of plotter type to be deferred until after program execution. QUICK is written in RATFOR for use on Sun4 series computers running SunOS. No source code is provided. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in ASCII format is included on the distribution medium. QUICK was developed in 1991 and is a copyrighted work with all copyright vested in NASA.

  18. AutoCAD-To-NASTRAN Translator Program

    NASA Technical Reports Server (NTRS)

    Jones, A.

    1989-01-01

    Program facilitates creation of finite-element mathematical models from geometric entities. AutoCAD to NASTRAN translator (ACTON) computer program developed to facilitate quick generation of small finite-element mathematical models for use with NASTRAN finite-element modeling program. Reads geometric data of drawing from Data Exchange File (DXF) used in AutoCAD and other PC-based drafting programs. Written in Microsoft Quick-Basic (Version 2.0).
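
    Since the abstract describes reading geometric entities from a DXF file, a minimal sketch of that parsing step may help. DXF is a plain-text stream of (group code, value) pairs; the sketch below (in Python, not ACTON's Quick-Basic) extracts LINE endpoints, with the input file name purely hypothetical.

```python
# Illustrative sketch of the DXF parsing step such a translator performs.
# DXF group codes: "0" starts an entity; for LINE, 10/20 are the start X/Y
# and 11/21 the end X/Y. This mirrors the idea, not ACTON's implementation.
def read_dxf_lines(path):
    """Yield ((x1, y1), (x2, y2)) for each LINE entity in a DXF file."""
    with open(path) as f:
        pairs = [(code.strip(), value.strip()) for code, value in zip(f, f)]
    current = None
    for code, value in pairs:
        if code == "0":                       # start of a new entity
            if current and len(current) == 4:
                yield ((current["10"], current["20"]),
                       (current["11"], current["21"]))
            current = {} if value == "LINE" else None
        elif current is not None and code in ("10", "20", "11", "21"):
            current[code] = float(value)
    if current and len(current) == 4:
        yield ((current["10"], current["20"]), (current["11"], current["21"]))

for start, end in read_dxf_lines("drawing.dxf"):  # hypothetical input file
    print(start, "->", end)
```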

  19. A Parallel Numerical Micromagnetic Code Using FEniCS

    NASA Astrophysics Data System (ADS)

    Nagy, L.; Williams, W.; Mitchell, L.

    2013-12-01

    Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However, the computational cost rises exceedingly quickly with respect to the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently, to better exploit modern computational resources our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, The Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software; in particular software that employs the finite element method. The advantages of this approach are that it leverages pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/Parmetis, etc.) and exposes these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users to not only run micromagnetic models in parallel, but also to perform pre/post processing of data.
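
    To illustrate the "pose problems in the mathematical language of the domain" style that FEniCS enables, here is a minimal legacy-FEniCS (dolfin) sketch: a Poisson solve, the canonical finite-element example, standing in for the micromagnetic energy minimisation the actual code performs.

```python
# Minimal legacy-FEniCS (dolfin) sketch: the weak form is written almost
# exactly as it appears on paper, and assembly/solve runs on PETSc backends.
from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                    Function, DirichletBC, Constant, dot, grad, dx, solve)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)            # continuous piecewise-linear elements

u, v = TrialFunction(V), TestFunction(V)
a = dot(grad(u), grad(v)) * dx              # bilinear form: weak Laplacian
L = Constant(1.0) * v * dx                  # unit source term
bc = DirichletBC(V, Constant(0.0), "on_boundary")

uh = Function(V)
solve(a == L, uh, bc)                       # assemble and solve the linear system
print("max u:", uh.vector().max())
```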

  20. Computational Exposure Science: An Emerging Discipline to Support 21st-Century Risk Assessment

    EPA Science Inventory

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving.Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elem...

  1. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    ERIC Educational Resources Information Center

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science, it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  2. Pocket-sized versus standard ultrasound machines in abdominal imaging.

    PubMed

    Tse, K H; Luk, W H; Lam, M C

    2014-06-01

    The pocket-sized ultrasound machine has emerged as an invaluable tool for quick assessment in emergency and general practice settings. It is suitable for instant and quick assessment in cardiac imaging. However, its applicability in the imaging of other body parts has yet to be established. In this pictorial review, we compared the performance of the pocket-sized ultrasound machine against the standard ultrasound machine for its image quality in common abdominal pathology.

  3. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etmektzoglou, A; Mishra, P; Svatos, M

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full-time employee at Varian Medical Systems, Palo Alto, California.

  4. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low thrust trajectory optimization tools have been developed over the last 3.5 years by the Low Thrust Trajectory Tools development team. This toolset includes both low-medium fidelity and high fidelity tools which allow the analyst to quickly research a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.

  5. Experimental validation of convection-diffusion discretisation scheme employed for computational modelling of biological mass transport

    PubMed Central

    2010-01-01

    Background The finite volume solver Fluent (Lebanon, NH, USA) is a computational fluid dynamics software employed to analyse biological mass-transport in the vasculature. A principal consideration for computational modelling of blood-side mass-transport is convection-diffusion discretisation scheme selection. Due to numerous discretisation schemes available when developing a mass-transport numerical model, the results obtained should either be validated against benchmark theoretical solutions or experimentally obtained results. Methods An idealised aneurysm model was selected for the experimental and computational mass-transport analysis of species concentration due to its well-defined recirculation region within the aneurysmal sac, allowing species concentration to vary slowly with time. The experimental results were obtained from fluid samples extracted from a glass aneurysm model, using the direct spectrophotometric concentration measurement technique. The computational analysis was conducted using the four convection-diffusion discretisation schemes available to the Fluent user, including the First-Order Upwind, the Power Law, the Second-Order Upwind and the Quadratic Upstream Interpolation for Convective Kinetics (QUICK) schemes. The fluid has a diffusivity of 3.125 × 10^-10 m^2/s in water, resulting in a Peclet number of 2,560,000, indicating strongly convection-dominated flow. Results The discretisation scheme applied to the solution of the convection-diffusion equation, for blood-side mass-transport within the vasculature, has a significant influence on the resultant species concentration field. The First-Order Upwind and the Power Law schemes produce similar results. The Second-Order Upwind and QUICK schemes also correlate well but differ considerably from the concentration contour plots of the First-Order Upwind and Power Law schemes. The computational results were then compared to the experimental findings. Average errors of 140% and 116% were demonstrated between the experimental results and those obtained from the First-Order Upwind and Power Law schemes, respectively. However, both the Second-Order Upwind and QUICK schemes accurately predict species concentration under high Peclet number, convection-dominated flow conditions. Conclusion Convection-diffusion discretisation scheme selection has a strong influence on resultant species concentration fields, as determined by CFD. Furthermore, either the Second-Order or QUICK discretisation schemes should be implemented when numerically modelling convection-dominated mass-transport conditions. Finally, care should be taken not to utilize computationally inexpensive discretisation schemes at the cost of accuracy in resultant species concentration. PMID:20642816
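
    A minimal sketch of the scheme-selection issue the study examines: the 1D steady convection-diffusion equation discretised with the First-Order Upwind scheme, compared against the exact solution. Velocity, diffusivity and grid size are illustrative values, not the paper's Fluent setup; at a cell Peclet number above 2, the first-order scheme's artificial diffusion becomes visible, which is what motivates higher-order schemes such as QUICK.

```python
# First-Order Upwind discretisation of u*dc/dx = D*d2c/dx2 on [0,1],
# with c(0)=0, c(1)=1, compared to the exact solution.
import numpy as np

u, D, n = 1.0, 0.01, 21                   # velocity, diffusivity, grid points (assumed)
dx = 1.0 / (n - 1)
print("cell Peclet number:", u * dx / D)  # > 2, so upwind smearing is expected

A = np.zeros((n, n)); b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0
b[-1] = 1.0                               # Dirichlet boundary values
for i in range(1, n - 1):
    aW = D / dx**2 + u / dx               # upwind: convection taken from the west node
    aE = D / dx**2
    A[i, i - 1], A[i, i], A[i, i + 1] = -aW, aW + aE, -aE

c = np.linalg.solve(A, b)
x = np.linspace(0.0, 1.0, n)
Pe = u / D                                # global Peclet number
c_exact = (np.exp(Pe * x) - 1.0) / (np.exp(Pe) - 1.0)
print("max |error| vs exact solution:", np.abs(c - c_exact).max())
```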

  6. DarkBit: a GAMBIT module for computing dark matter observables and likelihoods

    NASA Astrophysics Data System (ADS)

    Bringmann, Torsten; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Kahlhoefer, Felix; Kvellestad, Anders; Putze, Antje; Savage, Christopher; Scott, Pat; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-12-01

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model.

  7. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  8. Rapid E-Learning Simulation Training and User Response

    ERIC Educational Resources Information Center

    Rackler, Angeline

    2011-01-01

    A new trend in e-learning development is to have subject matter experts use rapid development tools to create training simulations. This type of training is called rapid e-learning simulation training. Though companies are using rapid development tools to create training quickly and cost effectively, there is little empirical research to indicate…

  9. Research Tools, Tips, and Resources for Financial Aid Administrators. Monograph, A NASFAA Series.

    ERIC Educational Resources Information Center

    Mohning, David D.; Redd, Kenneth E.; Simmons, Barry W., Sr.

    This monograph provides research tools, tips, and resources to financial aid administrators who need to undertake research tasks. It answers: What is research? How can financial aid administrators get started on research projects? What resources are available to help answer research questions quickly and accurately? How can research efforts assist…

  10. Practicing evidence based medicine at the bedside: a randomized controlled pilot study in undergraduate medical students assessing the practicality of tablets, smartphones, and computers in clinical life.

    PubMed

    Friederichs, Hendrik; Marschall, Bernhard; Weissenstein, Anne

    2014-12-05

    Practicing evidence-based medicine is an important aspect of providing good medical care. Accessing external information through literature searches on computer-based systems can effectively achieve integration in clinical care. We conducted a pilot study using smartphones, tablets, and stationary computers as search devices at the bedside. The objective was to determine possible differences between the various devices and assess students' internet use habits. In a randomized controlled pilot study, 120 students were divided in three groups. One control group solved clinical problems on a computer and two intervention groups used mobile devices at the bedside. In a questionnaire, students were asked to report their internet use habits as well as their satisfaction with their respective search tool using a 5-point Likert scale. Of 120 surveys, 94 (78.3%) complete data sets were analyzed. The mobility of the tablet (3.90) and the smartphone (4.39) was seen as a significant advantage over the computer (2.38, p < .001). However, for performing an effective literature search at the bedside, the computer (3.22) was rated superior to both tablet computers (2.13) and smartphones (1.68). No significant differences were detected between tablets and smartphones except satisfaction with screen size (tablet 4.10, smartphone 2.00, p < .001). Using a mobile device at the bedside to perform an extensive search is not suitable for students who prefer using computers. However, mobility is regarded as a substantial advantage, and therefore future applications might facilitate quick and simple searches at the bedside.

  11. Reconciling Scientific Aspirations and Engineering Constraints for a Lunar Mission via Hyperdimensional Interpolation

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Clark, Pamela; Elfes, Alberto; Smith, Jeffrey H.; Mrozinski, Joseph; Adumitroaie, Virgil; Hua, Hook; Shelton, Kacie; Lincoln, William; Silberg, Robert

    2010-01-01

    Virtually every NASA space-exploration mission represents a compromise between the interests of two expert, dedicated, but very different communities: scientists, who want to go quickly to the places that interest them most and spend as much time there as possible conducting sophisticated experiments, and the engineers and designers charged with maximizing the probability that a given mission will be successful and cost-effective. Recent work at NASA's Jet Propulsion Laboratory (JPL) seeks to enhance communication between these two groups, and to help them reconcile their interests, by developing advanced modeling capabilities with which they can analyze the achievement of science goals and objectives against engineering design and operational constraints. The analyses conducted prior to this study have been point-design driven. Each analysis has been of one hypothetical case which addresses the question: Given a set of constraints, how much science can be done? But the constraints imposed by the architecture team (e.g., rover speed, time allowed for extravehicular activity (EVA), and the number of sites at which science experiments are to be conducted) are all in early development and carry a great deal of uncertainty. Variations can be incorporated into the analysis, and indeed that has been done in sensitivity studies designed to see which constraint variations have the greatest impact on results. But if a very large number of variations can be analyzed all at once, producing a table that includes virtually the entire trade space under consideration, then we have a tool that enables scientists and mission architects to ask the inverse question: For a given desired level of science (or any other objective), what is the range of constraints that would be needed? With this tool, mission architects could determine, for example, what combinations of rover speed, EVA duration, and other constraints produce the desired results. Further, this tool would help them identify which technology-improvement investments would be likely to produce the largest or most important return. However, the number of variations that need to be considered for such analysis quickly balloons to an unwieldy size. If three variations are considered for each of five constraints (a very modest example), there are a total of 3^5 = 243 variations to consider. If it takes 40 minutes to compute each variation, as it does with HURON, our automated optimization system, then it would take 162 hours or nearly 7 days of round-the-clock computing to calculate the results. Adding further constraints or variations exponentially increases the amount of time that is needed.
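
    A quick check of the scaling argument, using the figures quoted above:

```latex
% trade-space growth: v variations per constraint, k constraints => v^k cases
\[
  N = v^{k} = 3^{5} = 243, \qquad
  T = N \times 40\,\mathrm{min} = 9720\,\mathrm{min} = 162\,\mathrm{h} \approx 6.75\ \mathrm{days}
\]
% one more constraint at three variations triples the cost: 3^6 = 729 cases (~486 h)
```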

  12. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  13. Modeling and Simulation Reliable Spacecraft On-Board Computing

    NASA Technical Reports Server (NTRS)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling and simulation-driven testing and fault tolerance schemes for Spacecraft On-Board Computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast, and cost-effective on-board computing system, which has been known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay) and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before employing fault tolerance in the system. Testing and fault tolerance strategies should be driven by accurate performance models (i.e. throughput, delay, reliability and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module, and a module for fault tolerance, all of which interact through a central graphical user interface.

  14. dataMares - An online platform for the fast, effective dissemination of science

    NASA Astrophysics Data System (ADS)

    Johnson, A. F.; Aburto-Oropeza, O.; Moreno-Báez, M.; Giron-Nava, A.; Lopez-Sagástegui, R.; Lopez-Sagástegui, C.

    2016-02-01

    One of the current challenges in public policy development, especially related to natural resource management and conservation, is that there are very few tools that help easily identify and incorporate relevant scientific findings and data into public policy. This can also lead to a repetition of research efforts and the collection of information that in some cases might already exist. The key to addressing this challenge is to develop collaborative research tools, which can be used by different sectors of society including key stakeholder groups, managers, policy makers and the public. Here we present an "open science" platform capable of handling large data and disseminating results to a wide audience quickly. dataMares uses business intelligence software to present data dynamically to a range of users online. In a nutshell, dataMares provides robust and up-to-date scientific information for resource managers, conservation practitioners, fishers, community members, and regional and national level decision-makers. It can also be used in the training of young scientists and allows quick and open connections with the journalism industry.

  15. Computer-Based Internet-Hosted Assessment of L2 Literacy: Computerizing and Administering of the Oxford Quick Placement Test in ExamView and Moodle

    NASA Astrophysics Data System (ADS)

    Meurant, Robert C.

    Sorting of Korean English-as-a-Foreign-Language (EFL) university students by Second Language (L2) aptitude allocates students to classes of compatible ability level, and was here used to screen candidates for interview. Paper-and-pen versions of the Oxford Quick Placement Test were adapted to computer-based testing via online hosting using FSCreations ExamView. Problems with their online hosting site led to conversion to the popular computer-based learning management system Moodle, hosted on www.ninehub.com. 317 sophomores were tested online to encourage L2 digital literacy. Strategies for effective hybrid implementation of Learning Management Systems in L2 tertiary education include computer-based Internet-hosted L2 aptitude tests. These potentially provide a convenient measure of student progress in developing L2 fluency, and offer a more objective and relevant means of teacher- and course-assessment than student evaluations, which tend to confuse entertainment value and teacher popularity with academic credibility and pedagogical effectiveness.

  16. Measurement properties of the QuickDASH (disabilities of the arm, shoulder and hand) outcome measure and cross-cultural adaptations of the QuickDASH: a systematic review.

    PubMed

    Kennedy, Carol A; Beaton, Dorcas E; Smith, Peter; Van Eerd, Dwayne; Tang, Kenneth; Inrig, Taucha; Hogg-Johnson, Sheilah; Linton, Denise; Couban, Rachel

    2013-11-01

    To identify and synthesize evidence for the measurement properties of the QuickDASH, a shortened version of the 30-item DASH (Disabilities of the Arm, Shoulder and Hand) instrument. This systematic review used a best evidence synthesis approach to critically appraise the measurement properties [using COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN)] of the QuickDASH and cross-cultural adaptations. A standard search strategy was conducted between 2005 (year of first publication of QuickDASH) and March 2011 in MEDLINE, EMBASE and CINAHL. The search identified 14 studies to include in the best evidence synthesis of the QuickDASH. A further 11 studies were identified on eight cross-cultural adaptation versions. Many measurement properties of the QuickDASH have been evaluated in multiple studies and across most of the measurement properties. The best evidence synthesis of the QuickDASH English version suggests that this tool is performing well with strong positive evidence for reliability and validity (hypothesis testing), and moderate positive evidence for structural validity testing. Strong negative evidence was found for responsiveness due to lower correlations with global estimates of change. Information about the measurement properties of the cross-cultural adaptation versions is still lacking, or the available information is of poor overall methodological quality.

  17. Processing of on-board recorded data for quick analysis of aircraft performance. [rotor systems research aircraft

    NASA Technical Reports Server (NTRS)

    Michaud, N. H.

    1979-01-01

    A system of independent computer programs for the processing of digitized pulse code modulated (PCM) and frequency modulated (FM) data is described. Information is stored in a set of random files and accessed to produce both statistical and graphical output. The software system is designed primarily to present these reports within a twenty-four hour period for quick analysis of the helicopter's performance.

  18. Delineating landscape view areas...a computer approach

    Treesearch

    Elliot L. Amidon; Gary H. Elsner

    1968-01-01

    The terrain visible from a given point can be determined quickly and efficiently by a computer. A FORTRAN subprogram--called VIEWIT--has been developed for this purpose. Input consists of data on elevations, by coordinates, which can be obtained from maps or aerial photos. The computer will produce an overlay that shows the maximum area visible from an observation point...
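
    The core of such a visibility computation is a line-of-sight test: a cell is visible if no intermediate cell along the sight line subtends a steeper vertical angle from the observer. A minimal Python sketch (not the original FORTRAN VIEWIT), with grid spacing and observer height as assumed values:

```python
# Brute-force viewshed over an elevation grid: for each target cell, sample
# the sight line and check that the target's slope is the steepest seen.
import numpy as np

def visible_mask(dem, obs_rc, cell=30.0, obs_h=1.8):
    """Boolean array: True where the DEM cell is visible from obs_rc."""
    r0, c0 = obs_rc
    z0 = dem[r0, c0] + obs_h
    vis = np.zeros_like(dem, dtype=bool)
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            n = max(abs(r - r0), abs(c - c0))
            if n == 0:
                vis[r, c] = True
                continue
            rows = np.linspace(r0, r, n + 1)[1:]
            cols = np.linspace(c0, c, n + 1)[1:]
            dist = np.hypot(rows - r0, cols - c0) * cell
            zline = dem[rows.round().astype(int), cols.round().astype(int)]
            slopes = (zline - z0) / dist
            # visible if the target slope beats every intermediate slope
            vis[r, c] = slopes[-1] >= slopes[:-1].max(initial=-np.inf)
    return vis

dem = np.random.default_rng(0).random((50, 50)) * 100.0  # toy elevation data
print(visible_mask(dem, (25, 25)).sum(), "cells visible")
```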

  19. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments.

    PubMed

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-07-08

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests and lacking support for ranking of primers, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
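
    For readers unfamiliar with the "stringent filtering constraints" mentioned above, the sketch below shows the generic kind of length/GC/melting-temperature filter a primer-design pipeline applies. These are textbook rules of thumb (including the Wallace-rule Tm estimate), not MRPrimerW's actual filters.

```python
# Generic primer-candidate filter: length, GC content and a crude Tm estimate.
def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace rule Tm estimate: 2 C per A/T, 4 C per G/C (short oligos only)."""
    seq = seq.upper()
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

def passes_basic_filters(seq, gc=(0.4, 0.6), tm=(50, 65), length=(18, 25)):
    return (length[0] <= len(seq) <= length[1]
            and gc[0] <= gc_content(seq) <= gc[1]
            and tm[0] <= wallace_tm(seq) <= tm[1])

print(passes_basic_filters("ATGCGTACGTTAGCCTAGCA"))  # example 20-mer -> True
```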

  20. docBUILDER - Building Your Useful Metadata for Earth Science Data and Services.

    NASA Astrophysics Data System (ADS)

    Weir, H. M.; Pollack, J.; Olsen, L. M.; Major, G. R.

    2005-12-01

    The docBUILDER tool, created by NASA's Global Change Master Directory (GCMD), assists the scientific community in efficiently creating quality data and services metadata. Metadata authors are asked to complete five required fields to ensure enough information is provided for users to discover the data and related services they seek. After the metadata record is submitted to the GCMD, it is reviewed for semantic and syntactic consistency. Currently, two versions are available - a Web-based tool accessible with most browsers (docBUILDERweb) and a stand-alone desktop application (docBUILDERsolo). The Web version is available through the GCMD website, at http://gcmd.nasa.gov/User/authoring.html. This version has been updated and now offers: personalized templates to ease entering similar information for multiple data sets/services; automatic population of Data Center/Service Provider URLs based on the selected center/provider; three-color support to indicate required, recommended, and optional fields; an editable text window containing the XML record, to allow for quick editing; and improved overall performance and presentation. The docBUILDERsolo version offers the ability to create metadata records on a computer wherever you are. Except for installation and the occasional update of keywords, data/service providers are not required to have an Internet connection. This freedom will allow users with portable computers (Windows, Mac, and Linux) to create records in field campaigns, whether in Antarctica or the Australian Outback. This version also offers a spell-checker, in addition to all of the features found in the Web version.

  1. The Careful Puppet Master: Reducing risk and fortifying acceptance testing with Jenkins CI

    NASA Astrophysics Data System (ADS)

    Smith, Jason A.; Richman, Gabriel; DeStefano, John; Pryor, James; Rao, Tejas; Strecker-Kellogg, William; Wong, Tony

    2015-12-01

    Centralized configuration management, including the use of automation tools such as Puppet, can greatly increase provisioning speed and efficiency when configuring new systems or making changes to existing systems, reduce duplication of work, and improve automated processes. However, centralized management also brings with it a level of inherent risk: a single change in just one file can quickly be pushed out to thousands of computers and, if that change is not properly and thoroughly tested and contains an error, could result in catastrophic damage to many services, potentially bringing an entire computer facility offline. Change management procedures can—and should—be formalized in order to prevent such accidents. However, like the configuration management process itself, if such procedures are not automated, they can be difficult to enforce strictly. Therefore, to reduce the risk of merging potentially harmful changes into our production Puppet environment, we have created an automated testing system, which includes the Jenkins CI tool, to manage our Puppet testing process. This system includes the proposed changes and runs Puppet on a pool of dozens of RedHat Enterprise Virtualization (RHEV) virtual machines (VMs) that replicate most of our important production services for the purpose of testing. This paper describes our automated test system and how it hooks into our production approval process for automatic acceptance testing. All pending changes that have been pushed to production must pass this validation process before they can be approved and merged into production.
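
    A hedged sketch of such an acceptance-test gate: validate changed Puppet manifests, then do a --noop dry run before anything is merged. Paths and the manifest layout are illustrative assumptions, not the authors' actual Jenkins configuration; the puppet subcommands and flags shown are standard.

```python
# CI gate sketch: syntax-validate every manifest, then a no-op dry run.
import pathlib
import subprocess
import sys

def run(cmd):
    print("+", " ".join(cmd))
    return subprocess.run(cmd, capture_output=True, text=True)

def validate_manifests(root="environments/test"):
    for pp in pathlib.Path(root).rglob("*.pp"):
        r = run(["puppet", "parser", "validate", str(pp)])
        if r.returncode != 0:
            print(r.stderr)
            return False
    return True

def noop_apply(manifest="environments/test/manifests/site.pp"):
    # --noop reports what *would* change without touching the system;
    # detailed exit codes: 0 = no changes, 2 = changes pending, 4/6 = failures
    r = run(["puppet", "apply", "--noop", "--detailed-exitcodes", manifest])
    return r.returncode in (0, 2)

if __name__ == "__main__":
    sys.exit(0 if validate_manifests() and noop_apply() else 1)
```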

  2. A Computer System for Making Quick and Economical Color Slides.

    ERIC Educational Resources Information Center

    Pryor, Harold George

    1986-01-01

    A computer-based method for producing 35mm color slides has been used in Ohio State University's College of Dentistry. The method can produce both text and slides in less than two hours, providing substantial flexibility in planning and revising visual presentations. (Author/MLW)

  3. Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start ...

    EPA Pesticide Factsheets

    The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the questions in that guide. When performing a Rapid Benefits Indicator (RBI) assessment on wetlands restoration site(s), results can be recorded and reviewed using this VBA-enabled MS Excel Checklist Tool.

  4. Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.

    PubMed

    Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E

    2012-03-19

    A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
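
    As a concrete illustration of "provisioning on-demand infrastructure" on EC2, the boto3 sketch below starts a single analysis node. The AMI ID, key pair and instance type are placeholders, not values published by the Cloud BioLinux project.

```python
# Launch one on-demand EC2 instance from a machine image with boto3.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical Cloud BioLinux image ID
    InstanceType="m5.xlarge",          # sized for alignment/assembly workloads
    KeyName="my-keypair",              # an existing EC2 key pair (assumed)
    MinCount=1, MaxCount=1,
)
node = instances[0]
node.wait_until_running()
node.reload()                          # refresh attributes to pick up the public IP
print("connect with: ssh ubuntu@" + (node.public_ip_address or "<pending>"))
```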

  6. π Scope: python based scientific workbench with visualization tool for MDSplus data

    NASA Astrophysics Data System (ADS)

    Shiraiwa, S.

    2014-10-01

    πScope is a Python-based scientific data analysis and visualization tool constructed on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivation for developing the new software is 1) to provide an updated tool to browse MDSplus data, with functionalities beyond dwscope and jScope, and 2) to provide a universal foundation to construct interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features to visualize MDSplus data during tokamak experiments including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using Python scripts, and publication quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand while the open source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and a script, enabling relatively complex data analysis workflows to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported to πScope, thus allowing LHCD simulation to be run between-shot using C-Mod experimental profiles. This workflow is being used to generate a large database to develop a LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
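
    The kind of MDSplus access a πScope session script automates can be sketched with the standard MDSplus Python bindings; the tree name, shot number and node path below are assumptions, and this uses the generic bindings rather than πScope's own API.

```python
# Fetch one signal and its time base from an MDSplus tree, then plot it.
import matplotlib.pyplot as plt
from MDSplus import Tree

tree = Tree("cmod", 1120815000)     # hypothetical experiment tree and shot number
node = tree.getNode(r"\ip")         # plasma current node; path is an assumption
y = node.data()                     # signal samples
t = node.dim_of().data()            # time base stored with the signal

plt.plot(t, y)
plt.xlabel("time (s)")
plt.ylabel("Ip (A)")
plt.show()
```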

  7. The MaizeGDB Genome Browser Tutorial: One example of database outreach to biologists via video

    USDA-ARS?s Scientific Manuscript database

    Video tutorials are an effective way for researchers to quickly learn how to use online tools offered by biological databases. At the Maize Genetics and Genomics Database (MaizeGDB), we have developed a number of video tutorials that aim to demonstrate how to use various tools as well as to explici...

  8. A visual training tool for the Photoload sampling technique

    Treesearch

    Violet J. Holley; Robert E. Keane

    2010-01-01

    This visual training aid is designed to provide Photoload users a tool to increase the accuracy of fuel loading estimations when using the Photoload technique. The Photoload Sampling Technique (RMRS-GTR-190) provides fire managers a sampling method for obtaining consistent, accurate, inexpensive, and quick estimates of fuel loading. It is designed to require only one...

  9. A tool for rapid post-hurricane urban tree debris estimates using high resolution aerial imagery

    Treesearch

    Zoltan Szantoi; Sparkle L Malone; Francisco Escobedo; Orlando Misas; Scot Smith; Bon Dewitt

    2012-01-01

    Coastal communities in the southeast United States have regularly experienced severe hurricane impacts. To better facilitate recovery efforts in these communities following natural disasters, state and federal agencies must respond quickly with information regarding the extent and severity of hurricane damage and the amount of tree debris volume. A tool was developed...

  10. Masters of All They Survey

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2012-01-01

    This article discusses a survey tool, known as TechQual+, which gives IT leaders a quick and easy way to gauge their departments' performance on campus and learn what matters to their constituents, including faculty, students, and staff. The idea of an IT survey tool that can be used across higher education has its skeptics, who feel that colleges…

  11. Web-Based Phylogenetic Assignment Tool for Analysis of Terminal Restriction Fragment Length Polymorphism Profiles of Microbial Communities

    PubMed Central

    Kent, Angela D.; Smith, Dan J.; Benson, Barbara J.; Triplett, Eric W.

    2003-01-01

    Culture-independent DNA fingerprints are commonly used to assess the diversity of a microbial community. However, relating species composition to community profiles produced by community fingerprint methods is not straightforward. Terminal restriction fragment length polymorphism (T-RFLP) is a community fingerprint method in which phylogenetic assignments may be inferred from the terminal restriction fragment (T-RF) sizes through the use of web-based resources that predict T-RF sizes for known bacteria. The process quickly becomes computationally intensive due to the need to analyze profiles produced by multiple restriction digests and the complexity of profiles generated by natural microbial communities. A web-based tool is described here that rapidly generates phylogenetic assignments from submitted community T-RFLP profiles based on a database of fragments produced by known 16S rRNA gene sequences. Users have the option of submitting a customized database generated from unpublished sequences or from a gene other than the 16S rRNA gene. This phylogenetic assignment tool allows users to employ T-RFLP to simultaneously analyze microbial community diversity and species composition. An analysis of the variability of bacterial species composition throughout the water column in a humic lake was carried out to demonstrate the functionality of the phylogenetic assignment tool. This method was validated by comparing the results generated by this program with results from a 16S rRNA gene clone library. PMID:14602639
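
    The core prediction behind such a tool is simple to state: for a 5'-labelled amplicon, the terminal restriction fragment runs from the labelled end to the enzyme's first cut site. A toy sketch follows (HhaI shown; real predictions run against a database of 16S rRNA gene sequences):

```python
# Predict the 5' terminal restriction fragment (T-RF) size for one sequence.
def terminal_fragment_length(seq, site="GCGC", cut_offset=3):
    """Length of the 5' terminal fragment, or None if the site is absent.

    site:       enzyme recognition sequence (HhaI, which cuts GCG^C, shown)
    cut_offset: position of the cut within the recognition site
    """
    i = seq.upper().find(site.upper())
    return None if i < 0 else i + cut_offset

print(terminal_fragment_length("AGAGTTTGATGCGCCTGGCTCAG"))  # -> 13
```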

  12. Rosetta CONSERT operations and data analysis preparation: simulation software tools.

    NASA Astrophysics Data System (ADS)

    Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

    2014-05-01

    The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the nucleus interior dielectric properties. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a small set of opportunities to acquire data. Thus, we need a fine analysis of the impact of Rosetta trajectory, Philae position and comet shape on CONSERT measurements, in order to make optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation, taking into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on domains that are large relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.
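
    The basic physics CONSERT exploits is compact enough to sketch: a radio wave crossing a chord of the nucleus is delayed relative to vacuum by the interior's relative permittivity, so measured travel times constrain the dielectric properties. Values below are illustrative, not mission data.

```python
# One-way travel time through a homogeneous dielectric: t = d * sqrt(eps_r) / c.
import math

C = 299_792_458.0          # speed of light in vacuum, m/s

def travel_time(path_m, eps_r):
    """Travel time (s) over path_m metres in a medium of relative permittivity eps_r."""
    return path_m * math.sqrt(eps_r) / C

chord = 2000.0             # 2 km chord through the nucleus (assumed)
for eps in (1.0, 1.5, 2.0):
    print(f"eps_r={eps}: {travel_time(chord, eps) * 1e6:.2f} microseconds")
```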

  13. Integrating Computer Interfaced Videodisc Systems in Introductory College Biology.

    ERIC Educational Resources Information Center

    Ebert-Zawasky, Kathleen; Abegg, Gerald L.

    This study was designed as a systematic investigation of the feasibility and effectiveness of student authored videodisc presentations in a non-major introductory level college biology course. Students (n=66) used a quick-learn authoring system, the Macintosh computer, and videodisc player with color monitor. Results included: (1) students managed…

  14. Application of Cloud Computing at KTU: MS Live@Edu Case

    ERIC Educational Resources Information Center

    Miseviciene, Regina; Budnikas, Germanas; Ambraziene, Danute

    2011-01-01

    Cloud computing is a significant alternative in today's educational perspective. The technology gives the students and teachers the opportunity to quickly access various application platforms and resources through the web pages on-demand. Unfortunately, not all educational institutions often have an ability to take full advantages of the newest…

  15. Foresters' Metric Conversions program (version 1.0). [Computer program]

    Treesearch

    Jefferson A. Palmer

    1999-01-01

    The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...
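
    The conversions themselves are simple multiplicative factors; a tiny sketch of the idea follows (the factors are standard definitions, and the original program's exact list of units is not reproduced here).

```python
# Bidirectional English/metric conversion from a single table of factors.
FACTORS = {
    ("ft", "m"): 0.3048,
    ("in", "cm"): 2.54,
    ("ac", "ha"): 0.404686,          # acres to hectares
    ("ft3", "m3"): 0.0283168,        # cubic feet to cubic metres
    ("mi", "km"): 1.609344,
}

def convert(value, src, dst):
    if (src, dst) in FACTORS:
        return value * FACTORS[(src, dst)]
    if (dst, src) in FACTORS:
        return value / FACTORS[(dst, src)]   # reverse direction: divide
    raise KeyError(f"no factor for {src}->{dst}")

print(convert(100.0, "ac", "ha"))   # 40.4686
print(convert(25.0, "km", "mi"))    # reverse direction works too
```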

  16. Going Green Robots

    ERIC Educational Resources Information Center

    Nelson, Jacqueline M.

    2011-01-01

    In looking at the interesting shapes and sizes of old computer parts, creating robots quickly came to the author's mind. In this article, she describes how computer parts can be used creatively. Students will surely enjoy creating their very own robots while learning about the importance of recycling in the society. (Contains 1 online resource.)

  17. The Accuracy of Computer-Assisted Feedback and Students' Responses to It

    ERIC Educational Resources Information Center

    Lavolette, Elizabeth; Polio, Charlene; Kahng, Jimin

    2015-01-01

    Various researchers in second language acquisition have argued for the effectiveness of immediate rather than delayed feedback. In writing, truly immediate feedback is impractical, but computer-assisted feedback provides a quick way of providing feedback that also reduces the teacher's workload. We explored the accuracy of feedback from…

  18. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next generation Planning Tool. The goal of this collaboration was to create a simulation based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon a mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  19. Evaluating a 2D image-based computerized approach for measuring riverine pebble roundness

    NASA Astrophysics Data System (ADS)

    Cassel, Mathieu; Piégay, Hervé; Lavé, Jérôme; Vaudor, Lise; Hadmoko Sri, Danang; Wibiwo Budi, Sandy; Lavigne, Franck

    2018-06-01

    The geometrical characteristics of pebbles are important features to study transport pathways, sedimentary history, depositional environments, abrasion processes or to target sediment sources. Both the shape and roundness of pebbles can be described by a still growing number of metrics in 2D and 3D or by visual charts. Despite new developments, existing tools remain proprietary and no pebble roundness toolbox has been available widely within the scientific community. The toolbox developed by Roussillon et al. (2009) automatically computes the size, shape and roundness indexes of pebbles from their 2D maximal projection planes. Using a digital camera, this toolbox operates using 2D pictures taken of pebbles placed on a one square meter red board, allowing data to be acquired quickly and efficiently at a large scale. Now that the toolbox is freely available for download,
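
    One classic 2D index computable from a pebble's projected outline is the isoperimetric circularity 4*pi*A/P^2, which equals 1.0 for a perfect disc and decreases for angular outlines. The sketch below computes it for a polygon; this is a generic index, not necessarily the one implemented in the toolbox above.

```python
# Circularity of a closed outline given as polygon vertices.
import math

def polygon_area_perimeter(pts):
    """Shoelace area and perimeter of a closed polygon given as (x, y) pairs."""
    a = p = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:] + pts[:1]):
        a += x0 * y1 - x1 * y0
        p += math.hypot(x1 - x0, y1 - y0)
    return abs(a) / 2.0, p

def circularity(pts):
    area, perim = polygon_area_perimeter(pts)
    return 4.0 * math.pi * area / perim**2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(circularity(square))   # pi/4 ~ 0.785; a near-circular outline -> ~1.0
```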

  20. Communication interference/jamming and propagation analysis system and its application to radio location

    NASA Astrophysics Data System (ADS)

    Kuzucu, H.

    1992-11-01

    Modern defense systems depend on comprehensive surveillance capability. The ability to detect and locate radio signals is a major element of a surveillance system. With the increasing need for more mobile surveillance systems in conjunction with the rapid deployment of forces and the advent of technology allowing enhanced use of small-aperture systems, tactical direction finding (DF) and radiolocation systems will have to be operated in diverse operational conditions. A quick assessment of the error levels expected and the evaluation of the reliability of the fixes on the targeted areas is crucial to the effectiveness of the missions relying on DF data. This paper presents a sophisticated, graphics-workstation-based computer tool developed for the system level analysis of radio communication systems and describes its use in radiolocation applications for realizing such accurate and realistic assessments with substantial money and time savings.
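
    The geometry behind a radiolocation fix can be sketched compactly: bearings from two DF stations define rays whose intersection estimates the emitter position (in reality, bearing errors turn the fix into an error ellipse, which is what such analysis tools quantify). Coordinates and bearings below are toy values.

```python
# Toy direction-finding fix: intersect two bearing rays on a flat plane.
import math

def df_fix(p1, brg1_deg, p2, brg2_deg):
    """Intersect two bearing rays; p = (x_east, y_north), bearing clockwise from north."""
    (x1, y1), (x2, y2) = p1, p2
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        return None                       # bearings (nearly) parallel: no fix
    # solve p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule
    t1 = ((x2 - x1) * (-d2[1]) - (-d2[0]) * (y2 - y1)) / det
    return (x1 + t1 * d1[0], y1 + t1 * d1[1])

print(df_fix((0.0, 0.0), 45.0, (10.0, 0.0), 315.0))   # fix at (5, 5)
```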

  1. Development of Northeast Asia Nuclear Power Plant Accident Simulator.

    PubMed

    Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff

    2017-06-15

    A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate consequences from a major accident that could occur at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training and potentially operational support in Korea's national emergency preparedness and response program. The system of codes, Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction, and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
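
    Of the three modules, the atmospheric dispersion step is the easiest to sketch: a ground-reflected Gaussian plume is the textbook starting point for such predictions. The sigma parameterisation and source term below are simplified assumptions for illustration, not NANAS's actual model.

```python
# Textbook ground-reflected Gaussian plume concentration.
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration (Bq/m^3 for Q in Bq/s) at crosswind y, height z."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection term
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

x = 5000.0                              # 5 km downwind (assumed)
sy, sz = 0.08 * x**0.9, 0.06 * x**0.9   # crude neutral-stability sigmas (assumed)
conc = gaussian_plume(Q=1e12, u=5.0, y=0.0, z=1.5, H=50.0, sigma_y=sy, sigma_z=sz)
print(f"{conc:.3e} Bq/m^3 at plume centreline, 5 km downwind")
```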

  2. Generating Stimuli for Neuroscience Using PsychoPy.

    PubMed

    Peirce, Jonathan W

    2008-01-01

    PsychoPy is a software library written in Python, using OpenGL to generate very precise visual stimuli on standard personal computers. It is designed to allow the construction of as wide a variety of neuroscience experiments as possible, with the least effort. By writing scripts in standard Python syntax users can generate an enormous variety of visual and auditory stimuli and can interact with a wide range of external hardware (enabling its use in fMRI, EEG, MEG etc.). The structure of scripts is simple and intuitive. As a result, new experiments can be written very quickly, and trying to understand a previously written script is easy, even with minimal code comments. PsychoPy can also generate movies and image sequences to be used in demos or simulated neuroscience experiments. This paper describes the range of tools and stimuli that it provides and the environment in which experiments are conducted.
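
    A minimal script in the style the abstract describes, presenting a drifting Gabor patch for one second (standard PsychoPy API; window and monitor settings are defaults):

```python
# Draw a drifting sinusoidal grating with a Gaussian mask for one second.
from psychopy import visual, core

win = visual.Window(size=(800, 600), fullscr=False, units="deg", monitor="testMonitor")
gabor = visual.GratingStim(win, tex="sin", mask="gauss", sf=2.0, size=4.0)

clock = core.Clock()
while clock.getTime() < 1.0:       # one second of stimulation
    gabor.phase += 0.05            # advance the phase each frame -> drift
    gabor.draw()
    win.flip()                     # swap buffers, synced to the screen refresh

win.close()
core.quit()
```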

  3. Project Management Software for Distributed Industrial Companies

    NASA Astrophysics Data System (ADS)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly to use in industrial environment. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed industrial companies. Having this in mind, special care has been devoted to development of appropriate tools for tracking, storing and analysis of the information about the project, and in-time delivering to the right team members or other responsible persons. The proposed solution is Internet-based and uses LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open source technology and simple maintenance. Modular structure of the software makes it easy for customization according to client specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, short training and only basic computer skills needed for operators.

  4. Design of an activity landscape view taking compound-based feature probabilities into account.

    PubMed

    Zhang, Bijun; Vogt, Martin; Bajorath, Jürgen

    2014-09-01

    Activity landscapes (ALs) of compound data sets are rationalized as graphical representations that integrate similarity and potency relationships between active compounds. ALs enable the visualization of structure-activity relationship (SAR) information and are thus computational tools of interest for medicinal chemistry. For AL generation, similarity and potency relationships are typically evaluated in a pairwise manner and major AL features are assessed at the level of compound pairs. In this study, we add a conditional probability formalism to AL design that makes it possible to quantify the probability of individual compounds to contribute to characteristic AL features. Making this information graphically accessible in a molecular network-based AL representation is shown to further increase AL information content and helps to quickly focus on SAR-informative compound subsets. This feature probability-based AL variant extends the current spectrum of AL representations for medicinal chemistry applications.
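
    The paper's per-compound conditional-probability formalism is not reproduced in the abstract. As a point of reference, the pairwise evaluation that ALs build on is often quantified with the structure-activity landscape index (SALI); a minimal sketch follows, with `similarity` standing in for any fingerprint Tanimoto computation (an assumption for illustration, not the authors' code).

        # Sketch of the standard pairwise AL evaluation:
        # SALI(i, j) = |potency_i - potency_j| / (1 - sim(i, j)).
        from itertools import combinations

        def sali(potencies, similarity, eps=1e-6):
            """Return {(i, j): SALI} for all compound pairs.
            potencies: dict id -> pKi; similarity: callable (i, j) -> [0, 1]."""
            scores = {}
            for i, j in combinations(potencies, 2):
                sim = similarity(i, j)
                scores[(i, j)] = abs(potencies[i] - potencies[j]) / (1.0 - sim + eps)
            return scores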

  5. DOT2: Macromolecular Docking With Improved Biophysical Models

    PubMed Central

    Roberts, Victoria A.; Thompson, Elaine E.; Pique, Michael E.; Perez, Martin S.; Eyck, Lynn Ten

    2015-01-01

    Computational docking is a useful tool for predicting macromolecular complexes, which are often difficult to determine experimentally. Here we present the DOT2 software suite, an updated version of the DOT intermolecular docking program. DOT2 provides straightforward, automated construction of improved biophysical models based on molecular coordinates, offering checkpoints that guide the user to include critical features. DOT has been updated to run more quickly, allow flexibility in grid size and spacing, and generate a complete list of favorable candidate configurations. Output can be filtered by experimental data and rescored by the sum of electrostatic and atomic desolvation energies. We show that this rescoring method improves the ranking of correct complexes for a wide range of macromolecular interactions, and demonstrate that biologically relevant models are essential for biologically relevant results. The flexibility and versatility of DOT2 accommodate realistic models of complex biological systems, improving the likelihood of a successful docking outcome. PMID:23695987

  6. miRNAFold: a web server for fast miRNA precursor prediction in genomes.

    PubMed

    Tav, Christophe; Tempel, Sébastien; Poligny, Laurent; Tahi, Fariza

    2016-07-08

    Computational methods are required for the prediction of non-coding RNAs (ncRNAs), which are involved in many biological processes, especially at the post-transcriptional level. Among these ncRNAs, miRNAs have been studied extensively, and biologists need efficient and fast tools for their identification. In particular, ab initio methods are usually required when predicting novel miRNAs. Here we present a web server dedicated to large-scale identification of miRNA precursors in genomes. It is based on an algorithm called miRNAFold that predicts miRNA hairpin structures quickly and with high sensitivity. miRNAFold is implemented as a web server with an intuitive and user-friendly interface, as well as a standalone version. The web server is freely available at: http://EvryRNA.ibisc.univ-evry.fr/miRNAFold. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.

  8. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Sophisticated 3D models of particle acceleration and transport in solar flares, both currently available and forthcoming, require a new level of user-friendly visualization and analysis tools that allow quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools in development, which have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLLs and shared libraries containing fast gyrosynchrotron (GS) emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows these default libraries to be interchanged with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tools' capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or the fast electron population implied by electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to the New Jersey Institute of Technology, by a UK STFC rolling grant, an STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.

  9. Development of 3-Year Roadmap to Transform the Discipline of Systems Engineering

    DTIC Science & Technology

    2010-03-31

    quickly humans could physically construct them. Indeed, magnetic core memory was entirely constructed by human hands until it was superseded by ... For their mainframe computers, IBM develops the applications, operating system, computer hardware and microprocessors (off-the-shelf standard memory) ... processor developers work on potential computational and memory pipelines to support the required performance capabilities and use the available transistors

  10. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    ERIC Educational Resources Information Center

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…

  11. Topside Electron Density Representations for Middle and High Latitudes: A Topside Parameterization for E-CHAIM Based On the NeQuick

    NASA Astrophysics Data System (ADS)

    Themens, David R.; Jayachandran, P. T.; Bilitza, Dieter; Erickson, Philip J.; Häggström, Ingemar; Lyashenko, Mykhaylo V.; Reid, Benjamin; Varney, Roger H.; Pustovalova, Ljubov

    2018-02-01

    In this study, we present a topside model representation to be used by the Empirical Canadian High Arctic Ionospheric Model (E-CHAIM). In the process, we also present a comprehensive evaluation of the NeQuick's, and by extension the International Reference Ionosphere's, topside electron density model for middle and high latitudes in the Northern Hemisphere. Using data gathered from all available incoherent scatter radars, topside sounders, and Global Navigation Satellite System radio occultation satellites, we show that the current NeQuick parameterization suboptimally represents the shape of the topside electron density profile at these latitudes and performs poorly in representing seasonal and solar cycle variations of the topside scale thickness. Despite this, the simple, one-variable NeQuick model is a powerful tool for modeling the topside ionosphere. By refitting the parameters that define the maximum topside scale thickness and the rate of increase of the scale height within the NeQuick topside model function (r and g, respectively), and refitting the model's parameterization of the scale height at the F region peak (H0), we find considerable improvement in the NeQuick's ability to represent the topside shape and behavior. Building on these results, we present a new topside model extension of E-CHAIM based on the revised NeQuick function. Overall, root-mean-square errors in topside electron density are improved over the traditional International Reference Ionosphere/NeQuick topside by 31% for the new NeQuick parameterization and by 36% for the newly proposed E-CHAIM topside.
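
    The abstract refers to r, g, and H0 without writing the profile out. For orientation, the NeQuick topside is commonly published as a semi-Epstein layer with a height-dependent scale thickness; a sketch of that standard form (to be checked against the paper itself) is:

        N(h) = \frac{4\, N_m F2}{\left(1 + e^{z}\right)^{2}}\, e^{z},
        \qquad z = \frac{h - h_m F2}{H},
        \qquad H = H_0 \left[ 1 + \frac{r\, g\, (h - h_m F2)}{r H_0 + g\, (h - h_m F2)} \right]

    where NmF2 and hmF2 are the F2-peak density and height, H0 is the scale height at the peak, r sets the maximum topside scale thickness, and g sets the rate of increase of the scale height, matching the roles the abstract assigns to r, g, and H0.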

  12. Coaxial cable stripper for confined areas

    NASA Technical Reports Server (NTRS)

    Brown, J. D.; Lipscomb, W. G.

    1968-01-01

    Manual coaxial cable stripper quickly and accurately prepares a coaxial cable in a confined area. With this tool, preparation time is greatly reduced, and a completely inexperienced technician can perform the operation.

  13. AVERT Main Module Quick Start Guide

    EPA Pesticide Factsheets

    Learn how to get started with the AVERT tool, which guides non-experts in evaluating county-level emissions displaced at electric power plants by energy efficiency and renewable energy policies and programs.

  14. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    NASA Astrophysics Data System (ADS)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  15. Variation simulation for compliant sheet metal assemblies with applications

    NASA Astrophysics Data System (ADS)

    Long, Yufeng

    Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture and electronic appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex and takes a long time to develop. As automotive customers demand products of increasing quality in a shorter time, engineers in the automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. Therefore, it is beneficial to develop computerized simulation and evaluation tools for the development of automotive body assembly systems. It is a well-known fact that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on the dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed for predicting dimensional variation of the assembly. Corresponding systematic tools can then be established to help engineers select assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation and springback. Based on the unified variation model, variation propagation models for multiple assembly stations with various configurations are established. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix; these are independent of the variation level of the process. Examples are given to demonstrate their application in selecting robust assembly architectures, and some useful guidelines for the selection of assembly architectures are summarized. In addition, to enhance fault diagnosis, a systematic methodology is proposed for the selection of measurement configurations. Specifically, principles involved in selecting measurements are generalized first; then, corresponding quantitative indices are developed to evaluate the measurement configurations; finally, examples are presented.

  16. The development and reliability of a simple field based screening tool to assess core stability in athletes.

    PubMed

    O'Connor, S; McCaffrey, N; Whyte, E; Moran, K

    2016-07-01

    To adapt the trunk stability test to facilitate further sub-classification of higher levels of core stability in athletes for use as a screening tool, and to establish the inter-tester and intra-tester reliability of this adapted core stability test. Reliability study. Collegiate athletic therapy facilities. Fifteen physically active male subjects (mean age 19.46 ± 0.63 years) free from any orthopaedic or neurological disorders were recruited from a convenience sample of collegiate students. Intraclass correlation coefficients (ICC) and 95% confidence intervals (CI) were computed to establish inter-tester and intra-tester reliability. Excellent ICC values were observed in the adapted core stability test for inter-tester reliability (0.97), along with good to excellent intra-tester reliability (0.73-0.90). While the 95% CIs were narrow for inter-tester reliability, the intra-tester 95% CIs for Testers A and C were widely distributed compared with Tester B's. The adapted core stability test developed in this study is a quick and simple field-based test to administer that can further subdivide athletes with high levels of core stability. The test demonstrated high inter-tester and intra-tester reliability. Copyright © 2015 Elsevier Ltd. All rights reserved.
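
    The abstract does not state which ICC model was used; one way to reproduce this kind of reliability analysis is with the pingouin library, which reports the common ICC forms with 95% CIs from a long-format table of scores. The data below are hypothetical, purely to show the call.

        # Hedged sketch: compute ICCs with pingouin from long-format ratings.
        import pandas as pd
        import pingouin as pg

        # Hypothetical data: four subjects each scored by three testers.
        df = pd.DataFrame({
            "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
            "tester":  ["A", "B", "C"] * 4,
            "score":   [4, 4, 5, 2, 3, 2, 5, 5, 5, 3, 3, 4],
        })

        icc = pg.intraclass_corr(data=df, targets="subject",
                                 raters="tester", ratings="score")
        print(icc[["Type", "ICC", "CI95%"]])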

  17. Easy and Fast Reconstruction of a 3D Avatar with an RGB-D Sensor.

    PubMed

    Mao, Aihua; Zhang, Hong; Liu, Yuxin; Zheng, Yinglong; Li, Guiqing; Han, Guoqiang

    2017-05-12

    This paper proposes a new easy and fast 3D avatar reconstruction method using an RGB-D sensor. Users can easily implement human body scanning and modeling with just a personal computer and a single RGB-D sensor, such as a Microsoft Kinect, within a small workspace in their home or office. To make the reconstruction of 3D avatars easy and fast, a new data capture strategy is proposed for efficient human body scanning, which captures only 18 frames from six views with a close scanning distance to fully cover the body; meanwhile, efficient alignment algorithms are presented to locally align the data frames in a single view and then globally align them across multiple views based on pairwise correspondence. In this method, we do not adopt shape priors or subdivision tools to synthesize the model, which helps to reduce modeling complexity. Experimental results indicate that this method can obtain accurate reconstructed 3D avatar models, and the running performance is faster than that of similar work. This research offers a useful tool for manufacturers to quickly and economically create 3D avatars for product design, entertainment and online shopping.

  18. Tracking Blade Tip Vortices for Numerical Flow Simulations of Hovering Rotorcraft

    NASA Technical Reports Server (NTRS)

    Kao, David L.

    2016-01-01

    Blade tip vortices generated by a helicopter rotor blade are a major source of rotor noise and airframe vibration. This occurs when a vortex passes closely by, and interacts with, a rotor blade. The accurate prediction of Blade Vortex Interaction (BVI) continues to be a challenge for Computational Fluid Dynamics (CFD). Though considerable research has been devoted to BVI noise reduction and experimental techniques for measuring the blade tip vortices in a wind tunnel, there are only a handful of post-processing tools available for extracting vortex core lines from CFD simulation data. In order to calculate the vortex core radius, most of these tools require the user to manually select a vortex core to perform the calculation. Furthermore, none of them provide the capability to track the growth of a vortex core, which is a measure of how quickly the vortex diffuses over time. This paper introduces an automated approach for tracking the core growth of a blade tip vortex from CFD simulations of rotorcraft in hover. The proposed approach offers an effective method for the quantification and visualization of blade tip vortices in helicopter rotor wakes. Keywords: vortex core, feature extraction, CFD, numerical flow visualization

  19. Students Soaring High with Software Spinoff

    NASA Technical Reports Server (NTRS)

    2004-01-01

    An educational software product designed by the Educational Technology Team at Ames Research Center is bringing actual aeronautical work performed by NASA engineers to the public in an interactive format for the very first time, in order to introduce future generations of engineers to the fundamentals of flight. The "Exploring Aeronautics" multimedia CD-ROM was created for use by teachers of students in grades 5 through 8. The software offers an introduction to aeronautics and covers the fundamentals of flight, including how airplanes take off, fly, and land. It contains a historical timeline and a glossary of aeronautical terms, examines different types of aircraft, and familiarizes its audience with the tools used by researchers to test aircraft designs, like wind tunnels and computational fluid dynamics. "'Exploring Aeronautics' was done in cartoon animation to make it appealing to kids," notes Andrew Doser, an Ames graphic artist who helped to produce the CD-ROM, along with a team of multimedia programmers, artists, and educators, in conjunction with numerous Ames scientists. In addition to lively animation, the software features QuickTime movies and highly intuitive tools to promote usage of NASA's scientific methods in the world of aeronautics.

  20. Active flow control insight gained from a modified integral boundary layer equation

    NASA Astrophysics Data System (ADS)

    Seifert, Avraham

    2016-11-01

    Active Flow Control (AFC) can alter the development of boundary layers, with applications such as reducing drag by delaying separation, or separating the boundary layers and enhancing vortex shedding to increase drag. Historically, significant effects of steady AFC methods were observed. Unsteady actuation is significantly more efficient than steady actuation. Full-scale AFC tests have been conducted with varying levels of success. While clearly relevant to industry, AFC implementation relies on expert knowledge with proven intuition and/or costly and lengthy computational efforts. This situation hinders the use of AFC while a simple, quick and reliable design method is absent. An updated form of the unsteady integral boundary layer (UIBL) equations, which includes AFC terms (unsteady wall transpiration and body forces), can be used to assist in AFC analysis and design. With these equations and a family of suitable velocity profiles, the momentum thickness can be calculated and matched with an outer potential-flow solution in a 2D and 3D manner to create an AFC design tool, parallel to proven tools for airfoil design. Limiting cases of the UIBL equation can be used to analyze candidate AFC concepts in terms of their capability to modify boundary layer development and system performance.
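
    The UIBL equations themselves are not reproduced in the abstract. For orientation, the classical steady von Kármán momentum-integral equation, with the wall-transpiration term through which steady AFC (suction or blowing) enters, can be written as:

        \frac{d\theta}{dx} + (2 + H)\,\frac{\theta}{U_e}\,\frac{dU_e}{dx} = \frac{C_f}{2} + \frac{v_w}{U_e}

    where θ is the momentum thickness, H the shape factor, Ue the edge velocity, Cf the skin-friction coefficient, and v_w the wall transpiration velocity; the unsteady form described in the abstract adds time-derivative and body-force terms to this balance.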

  1. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft relies heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
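
    A hedged sketch of the pipeline the abstract names (kernel density estimation, sequential feature selection, and k-nearest-neighbor classification) using scikit-learn; the authors' actual implementation and data are not shown, so the inputs below are synthetic.

        # Hedged scikit-learn sketch of the KDE + feature-selection + kNN idea.
        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.neighbors import KernelDensity, KNeighborsClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 20))                   # Monte Carlo dispersion inputs
        y = (X[:, 3] + 0.5 * X[:, 7] > 1).astype(int)     # hypothetical pass/fail flag

        # Rank design parameters by how much they help a kNN failure classifier.
        knn = KNeighborsClassifier(n_neighbors=5)
        sfs = SequentialFeatureSelector(knn, n_features_to_select=3).fit(X, y)
        print("influential parameters:", np.flatnonzero(sfs.get_support()))

        # Compare input densities of failed vs. passed runs for one parameter.
        kde_fail = KernelDensity(bandwidth=0.3).fit(X[y == 1][:, [3]])
        kde_pass = KernelDensity(bandwidth=0.3).fit(X[y == 0][:, [3]])
        grid = np.linspace(-3, 3, 7)[:, None]
        print(np.exp(kde_fail.score_samples(grid)) - np.exp(kde_pass.score_samples(grid)))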

  2. OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments

    NASA Astrophysics Data System (ADS)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    The evolution of hardware platforms, the modernization of software tools, access to the codes by a large number of young people, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modeling X-ray experiments. The implemented software architecture yields not only an intuitive and very easy-to-use graphical interface, but also high flexibility and rapidity for interactive simulations, making configuration changes quick so that multiple beamline configurations can be compared. Its purpose is to integrate, in a synergetic way, the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g., ray tracing and wave optics packages). It provides a language that makes them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (Slovenia), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.

  3. Real-Time Visualization of Spacecraft Telemetry for the GLAST and LRO Missions

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.; Shah, Neerav; Chai, Dean J.

    2010-01-01

    GlastCam and LROCam are closely related tools developed at NASA Goddard Space Flight Center for real-time visualization of spacecraft telemetry, built for the Gamma-Ray Large Area Space Telescope (GLAST) and Lunar Reconnaissance Orbiter (LRO) missions, respectively. Derived from a common simulation tool, they use related but different architectures to ingest real-time spacecraft telemetry and ground-predicted ephemerides, and to compute and display features of special interest to each mission in its operational environment. We describe the architectures of GlastCam and LROCam, the customizations required to fit into the mission operations environment, and the features that were found to be especially useful in early operations for their respective missions. Both tools have a primary "Cam" window depicting a three-dimensional view of the spacecraft that may be freely manipulated by the user. The scene is augmented with fields of view, pointing constraints, and other features which enhance situational awareness. Each tool also has a "Map" window showing the spacecraft's groundtrack projected onto a map of the Earth or Moon, along with useful features such as the Sun, eclipse regions, and TDRS satellite locations. Additional windows support specialized checkout tasks. One such window shows the star tracker fields of view, with tracking window locations and the mission star catalog. This view was instrumental for GLAST in quickly resolving a star tracker mounting polarity issue; visualization made the 180-deg mismatch immediately obvious. Full access to GlastCam's source code also made possible a rapid coarse star tracker mounting calibration with some on-the-fly code adjustments: adding a fine grid to measure alignment offsets, and introducing a calibration quaternion that could be adjusted within GlastCam without perturbing the flight parameters. This calibration, from concept to completion, took less than half an hour. Both GlastCam and LROCam were developed in the C language, with non-proprietary support libraries, for ease of customization and portability. This no-black-boxes aspect enables engineers to adapt quickly to unforeseen circumstances in the intense operations environment. GlastCam and LROCam were installed on multiple workstations in the operations support rooms, allowing independent use by multiple subsystems, systems engineers and managers, with negligible draw on telemetry system resources.

  4. Formatting scripts with computers and Extended BASIC.

    PubMed

    Menning, C B

    1984-02-01

    A computer program, written in Extended BASIC, is presented which enables scripts for educational media to be written quickly in a nearly unformatted style. From the resulting script file, stored on magnetic tape or disk, the program formats the script into either a storyboard, a presentation, or a narrator's script. Script headings and page and paragraph numbers are automatic features of the word processing. Suggestions are given for making personal modifications to the program.

  5. The Mark III Hypercube-Ensemble Computers

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Tuazon, Jesus O.; Lieberman, Don; Pniel, Moshe

    1988-01-01

    Mark III Hypercube concept applied in development of series of increasingly powerful computers. Processor of each node of Mark III Hypercube ensemble is specialized computer containing three subprocessors and shared main memory. Solves problem quickly by simultaneously processing part of problem at each such node and passing combined results to host computer. Disciplines benefitting from speed and memory capacity include astrophysics, geophysics, chemistry, weather, high-energy physics, applied mechanics, image processing, oil exploration, aircraft design, and microcircuit design.

  6. Aviation Careers Series: Airport Careers

    DOT National Transportation Integrated Search

    1996-01-29

    Intelligent Transportation Systems (ITS) offer tools to address transportation safety on several fronts, including improving control of the vehicle, mitigating circumstances that contribute to crashes, and responding more quickly to crashes that do o...

  7. Precision laser aiming system

    DOEpatents

    Ahrens, Brandon R [Albuquerque, NM; Todd, Steven N [Rio Rancho, NM

    2009-04-28

    A precision laser aiming system comprises a disrupter tool, a reflector, and a laser fixture. The disrupter tool, the reflector and the laser fixture are configurable for iterative alignment and aiming toward an explosive device threat. The invention enables a disrupter to be quickly and accurately set up, aligned, and aimed in order to render safe or to disrupt a target from a standoff position.

  8. History Educators and the Challenge of Immersive Pasts: A Critical Review of Virtual Reality "Tools" and History Pedagogy

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly however. Virtual reality technologies…

  9. The utility of postmortem computed tomography selective coronary angiography in parallel with autopsy.

    PubMed

    Inokuchi, Go; Yajima, Daisuke; Hayakawa, Mutsumi; Motomura, Ayumi; Chiba, Fumiko; Torimitsu, Suguru; Makino, Yohsuke; Iwase, Hirotaro

    2013-12-01

    Historically, coronary angiography of the isolated heart has played an important role in the detection of stenotic or occlusive lesions that are difficult to identify by autopsy alone. Meanwhile, although the application of multidetector computed tomography (MDCT) to forensic fields has accelerated recently, isolated single organ angiography with MDCT is rarely performed. In this article, we present an evaluation of postmortem selective coronary CT angiography of the isolated heart with MDCT and discuss its utility for autopsy. First, in a preliminary experiment using pig coronary artery, we examined the behavior of water soluble contrast material on postmortem computed tomography angiography (PMCTA) and found that better angiographic images were acquired when the viscosity of the contrast material was increased and CT was performed under conditions of sustained perfusion. Based on these results, we devised a selective coronary angiography procedure using a pressurized bag for drip infusion that can be performed easily, quickly, and at low cost. The angiographic images obtained provided useful supportive evidence of autopsy findings suggestive of ischemic heart disease. With active discussions underway in forensic fields on the proper use of postmortem computed tomography, PMCTA has also naturally attracted attention as it compensates for some of the shortcomings of CT alone. Although PMCTA typically involves whole-body angiography, if we view PMCTA as one of the many useful and supplementary tools available for autopsy, then isolated heart angiography continues to have utility in autopsy today.

  10. Using wireless handheld computers to seek information at the point of care: an evaluation by clinicians.

    PubMed

    Hauser, Susan E; Demner-Fushman, Dina; Jacobs, Joshua L; Humphrey, Susanne M; Ford, Glenn; Thoma, George R

    2007-01-01

    To evaluate: (1) the effectiveness of wireless handheld computers for online information retrieval in clinical settings; (2) the role of MEDLINE in answering clinical questions raised at the point of care. A prospective single-cohort study: accompanying medical teams on teaching rounds, five internal medicine residents used and evaluated MD on Tap, an application for handheld computers, to seek answers in real time to clinical questions arising at the point of care. All transactions were stored by an intermediate server. Evaluators recorded clinical scenarios and questions, identified MEDLINE citations that answered the questions, and submitted daily and summative reports of their experience. A senior medical librarian corroborated the relevance of the selected citation to each scenario and question. Evaluators answered 68% of 363 background and foreground clinical questions during rounding sessions using a variety of MD on Tap features in an average session length of less than four minutes. The evaluator, the number and quality of query terms, the total number of citations found for a query, and the use of auto-spellcheck significantly contributed to the probability of query success. Handheld computers with Internet access are useful tools for healthcare providers to access MEDLINE in real time. MEDLINE citations can answer specific clinical questions when several medical terms are used to form a query. The MD on Tap application is an effective interface to MEDLINE in clinical settings, allowing clinicians to quickly find relevant citations.

  11. Using Wireless Handheld Computers to Seek Information at the Point of Care: An Evaluation by Clinicians

    PubMed Central

    Hauser, Susan E.; Demner-Fushman, Dina; Jacobs, Joshua L.; Humphrey, Susanne M.; Ford, Glenn; Thoma, George R.

    2007-01-01

    Objective To evaluate: (1) the effectiveness of wireless handheld computers for online information retrieval in clinical settings; (2) the role of MEDLINE® in answering clinical questions raised at the point of care. Design A prospective single-cohort study: accompanying medical teams on teaching rounds, five internal medicine residents used and evaluated MD on Tap, an application for handheld computers, to seek answers in real time to clinical questions arising at the point of care. Measurements All transactions were stored by an intermediate server. Evaluators recorded clinical scenarios and questions, identified MEDLINE citations that answered the questions, and submitted daily and summative reports of their experience. A senior medical librarian corroborated the relevance of the selected citation to each scenario and question. Results Evaluators answered 68% of 363 background and foreground clinical questions during rounding sessions using a variety of MD on Tap features in an average session length of less than four minutes. The evaluator, the number and quality of query terms, the total number of citations found for a query, and the use of auto-spellcheck significantly contributed to the probability of query success. Conclusion Handheld computers with Internet access are useful tools for healthcare providers to access MEDLINE in real time. MEDLINE citations can answer specific clinical questions when several medical terms are used to form a query. The MD on Tap application is an effective interface to MEDLINE in clinical settings, allowing clinicians to quickly find relevant citations. PMID:17712085

  12. Virtual reality hardware for use in interactive 3D data fusion and visualization

    NASA Astrophysics Data System (ADS)

    Gourley, Christopher S.; Abidi, Mongi A.

    1997-09-01

    Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but it comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine, making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide, giving a variable field of view currently set at 160 degrees. A Silicon Graphics Indigo2 MaxImpact with the Impact Channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package with built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture-mapped model of the environment. A dataglove, magnetic tracker, and spaceball are used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.

  13. Exposure assessment in different occupational groups at a hospital using Quick Exposure Check (QEC) - a pilot study.

    PubMed

    Ericsson, Pernilla; Björklund, Martin; Wahlström, Jens

    2012-01-01

    In order to test the feasibility and sensitivity of the ergonomic exposure assessment tool Quick Exposure Check (QEC), a pilot study was conducted. The aim was to test QEC in different occupational groups, to compare the exposure in the most common work task with the exposure in the work task perceived as most strenuous for the neck/shoulder region, and to test intra-observer reliability. One experienced ergonomist observed 23 workers. The mean observation time was 45 minutes, including waiting time and time for complementary questions. The exposure scores varied between the different occupational groups as well as between workers within the occupational groups. Eighteen workers rated their most common work task as also being the most strenuous for the neck/shoulder region. For the remaining five workers, the mean exposure scores were higher for both the neck and the shoulder/arm in the most common work task. Intra-observer reliability showed agreement in 86% of the exposure interactions for the neck and in 71% for the shoulder/arm. QEC seems to fulfill the expectations of being a quick, sensible and practical exposure assessment tool that covers physical risk factors in the neck, upper extremities and low back.

  14. HyperCard for Educators. An Introduction.

    ERIC Educational Resources Information Center

    Bull, Glen L.; Harris, Judi

    This guide is designed to provide a quick introduction to the basic elements of HyperCard for teachers who are familiar with other computer applications but may not have worked with hypermedia applications; previous familiarity with HyperCard or with Macintosh computers is not necessary. It is noted that HyperCard is a software construction…

  15. Health Informatics Program Design and Outcomes: Learning from an Early Offering at a Mid-Level University

    ERIC Educational Resources Information Center

    Parker, Kevin R.; Srinivasan, Sankara Subramanian; Houghton, Robert F.; Kordzadeh, Nima; Bozan, Karoly; Ottaway, Thomas; Davey, Bill

    2017-01-01

    Curriculum development is particularly challenging in computing-related disciplines as the computing industry changes more quickly than most. As information technology degrees have become relatively pervasive, some institutions that offer information systems degrees have recognized a need to develop specialist studies in information systems. This…

  16. Computer assisted yarding cost analysis.

    Treesearch

    Ronald W. Mifflin

    1980-01-01

    Programs for a programmable calculator and a desktop computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...
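
    The calculator and desktop programs themselves are not reproduced here. A minimal sketch of the relationship the report emphasizes, with purely illustrative figures, is:

        # Unit yarding cost = machine rate ($/scheduled hour, ownership +
        # operating) divided by production rate (units/scheduled hour).
        def unit_cost(machine_rate, production_rate):
            """Dollars per unit yarded."""
            return machine_rate / production_rate

        # Compare two hypothetical yarding systems.
        for name, rate, prod in [("cable yarder", 185.0, 9.5), ("skidder", 95.0, 6.0)]:
            print(f"{name}: ${unit_cost(rate, prod):.2f} per unit")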

  17. Bilingualism in the Computer Age. 1987-88. OREA Evaluation Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Alvarez, Rosalyn

    Bilingualism in the Computer Age, a federally-funded bilingual education program at Morris High School in the Bronx (New York), served 197 native low-income Spanish-speaking students in its second year of funding. Program objectives were to improve students' English language proficiency and mainstream them as quickly as possible, develop their…

  18. Difference-Equation/Flow-Graph Circuit Analysis

    NASA Technical Reports Server (NTRS)

    Mcvey, I. M.

    1988-01-01

    Numerical technique enables rapid, approximate analyses of electronic circuits containing linear and nonlinear elements. Practiced in variety of computer languages on large and small computers; for circuits simple enough, programmable hand calculators used. Although some combinations of circuit elements make numerical solutions diverge, enables quick identification of divergence and correction of circuit models to make solutions converge.
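
    As a minimal instance of the technique described (not the original program), an RC low-pass circuit driven by a step input can be advanced with a forward-Euler difference equation; note that too large a time step makes the difference equation diverge, the failure mode the brief mentions.

        # Difference-equation sketch of an RC low-pass driven by a 5 V step:
        # v[n+1] = v[n] + (dt / (R * C)) * (vin - v[n]). Values illustrative.
        R, C, dt = 1e3, 1e-6, 1e-5        # ohms, farads, seconds (dt << R*C)
        vin, v = 5.0, 0.0
        for n in range(500):
            v += dt / (R * C) * (vin - v)  # forward-Euler update; diverges if dt > 2*R*C
        print(f"capacitor voltage after {500 * dt * 1e3:.1f} ms: {v:.3f} V")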

  19. Film Library Information Management System.

    ERIC Educational Resources Information Center

    Minnella, C. Vincent; And Others

    The computer program described not only allows the user to determine rental sources for a particular film title quickly, but also to select the least expensive of the sources. This program developed at SUNY Cortland's Sperry Learning Resources Center and Computer Center is designed to maintain accurate data on rental and purchase films in both…

  20. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  1. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating the very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for spaceflight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  2. Cost Function Network-based Design of Protein-Protein Interactions: predicting changes in binding affinity.

    PubMed

    Viricel, Clément; de Givry, Simon; Schiex, Thomas; Barbe, Sophie

    2018-02-20

    Accurate and economic methods to predict the change in protein binding free energy upon mutation are imperative to accelerate the design of proteins for a wide range of applications. Free energy is defined by enthalpic and entropic contributions. Following recent progress in Artificial Intelligence-based algorithms for guaranteed NP-hard energy optimization and partition function computation, it has become possible to quickly compute minimum energy conformations and to reliably estimate the entropic contribution of side-chains to the change in free energy of large protein interfaces. Using guaranteed Cost Function Network algorithms, Rosetta energy functions and Dunbrack's rotamer library, we developed and assessed EasyE and JayZ, two methods for binding affinity estimation that respectively ignore or include conformational entropic contributions, on a large benchmark of experimental binding affinity measurements. While both approaches outperform most established tools, we observe that side-chain conformational entropy brings little or no improvement on most systems but becomes crucial in some rare cases. EasyE and JayZ are available as open-source Python/C++ code at sourcesup.renater.fr/projects/easy-jayz. Contact: thomas.schiex@inra.fr and sophie.barbe@insa-toulouse.fr. Supplementary data are available at Bioinformatics online.

  3. Reduced-order surrogate models for Green's functions in black hole spacetimes

    NASA Astrophysics Data System (ADS)

    Galley, Chad; Wardell, Barry

    2016-03-01

    The fundamental nature of linear wave propagation in curved spacetime is encoded in the retarded Green's function (or propagator). Green's functions are useful tools because almost any field quantity of interest can be computed via convolution integrals with a source. In addition, perturbation theories involving nonlinear wave propagation can be expressed in terms of multiple convolutions of the Green's function. Recently, numerical solutions for propagators in black hole spacetimes have been found that are globally valid and accurate for computing physical quantities. However, the data generated is too large for practical use because the propagator depends on two spacetime points that must be sampled finely to yield accurate convolutions. I describe how to build a reduced-order model that can be evaluated as a substitute, or surrogate, for solutions of the curved spacetime Green's function equation. The resulting surrogate accurately and quickly models the original and out-of-sample data. I discuss applications of the surrogate, including self-consistent evolutions and waveforms of extreme mass ratio binaries. Green's function surrogate models provide a new and practical way to handle many old problems involving wave propagation and motion in curved spacetimes.
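
    The abstract does not spell out the construction, so the following is a generic reduced-order-model recipe in the same spirit: collect snapshots of an expensive two-point function, extract an SVD basis, and fit the basis coefficients so new points can be evaluated cheaply. The data here are synthetic stand-ins, not black hole propagator data.

        # Generic reduced-basis surrogate sketch (synthetic snapshot data).
        import numpy as np

        x = np.linspace(0.0, 1.0, 200)             # field-point grid
        s = np.linspace(0.0, 1.0, 40)              # sampled "source points"
        snapshots = np.array([np.exp(-40 * (x - si) ** 2) for si in s]).T  # (200, 40)

        # Reduced basis: keep enough SVD modes to capture ~99.99% of the energy.
        U, sig, _ = np.linalg.svd(snapshots, full_matrices=False)
        rank = int(np.searchsorted(np.cumsum(sig**2) / np.sum(sig**2), 0.9999)) + 1
        basis = U[:, :rank]                        # (200, rank)

        coeffs = basis.T @ snapshots               # (rank, 40) training coefficients
        # Surrogate at a new source point: interpolate each coefficient in s.
        s_new = 0.37
        c_new = np.array([np.interp(s_new, s, coeffs[k]) for k in range(rank)])
        g_new = basis @ c_new                      # cheap surrogate evaluation
        print(rank, g_new.shape)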

  4. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize and make available important know-how whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Collaborative computing therefore provides a common communication space, improves the sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is "government-off-the-shelf" document management software developed at the NASA Ames Research Center (ARC).

  5. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

    Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities of processing pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked throughout the sequence, which is a challenging task, especially in a congested open-outdoor urban space. A multipedestrian tracker based on an interframe detection-association process was proposed and evaluated. The tracker results are used to implement an automatic, video-processing-based tool for collecting data on pedestrians crossing the street. Variations in instantaneous speed allowed the detection of the street-crossing phases (approach, waiting, and crossing), which are addressed for the first time in pedestrian road safety analysis to illustrate the causal relationship between pedestrian behaviors in the different phases. A comparison with a manual data collection method, computing the root mean square error and the Pearson correlation coefficient, confirmed that the proposed procedures have significant potential to automate the data collection process.
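
    The authors' detector and tracker are not reproduced in the abstract. A hedged sketch of a generic interframe detection-association loop, using OpenCV background subtraction and greedy nearest-centroid association (the file name and thresholds are hypothetical), is:

        # Generic detection-association loop, not the paper's tracker.
        import cv2
        import numpy as np

        cap = cv2.VideoCapture("pedestrians.mp4")   # hypothetical input video
        backsub = cv2.createBackgroundSubtractorMOG2()
        tracks = {}                                  # track id -> last centroid
        next_id = 0

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = backsub.apply(frame)              # foreground mask for this frame
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            for c in contours:
                if cv2.contourArea(c) < 500:         # ignore small noise blobs
                    continue
                x, y, w, h = cv2.boundingRect(c)
                centroid = np.array([x + w / 2, y + h / 2])
                # Greedy association: reuse the closest existing track, else new id.
                best = min(tracks,
                           key=lambda t: np.linalg.norm(tracks[t] - centroid),
                           default=None)
                if best is not None and np.linalg.norm(tracks[best] - centroid) < 50:
                    tracks[best] = centroid
                else:
                    tracks[next_id] = centroid
                    next_id += 1
        cap.release()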

  6. Clinical experience with CT colonography

    NASA Astrophysics Data System (ADS)

    Reed, Judd E.; Garry, John L.; Wilson, Lynn A.; Johnson, C. Daniel

    2000-04-01

    Since the introduction of computed tomographic colonography (CTC) in 1995, many advances in computer equipment and software have become available. Despite these advances, the promise of colon cancer prevention has not been realized. A colorectal screening tool that performs at a high level, is acceptable to patients, and can be performed safely and at low cost holds promise of saving lives in the future. Our institution has performed over two hundred seventy-five clinical CTC examinations. These scans, which each entail a supine and a prone acquisition, differ from our research protocol only in the necessity of an expeditious interpretation. Patients arrive for their CTC examination early in the morning following a period of fasting and bowel preparation. If a CTC examination has a positive finding, the patient is scheduled for colonoscopic polypectomy that same morning. To facilitate this, patients are required to continue fasting until the CTC examination has been interpreted. It is therefore necessary to process the CTC examination very quickly to minimize patient discomfort. A positive CTC result occurred in fifteen percent of examinations. Among these positive results, the specificity has been in excess of ninety-five percent. Additionally, life-threatening extra-colonic lesions were discovered in two percent of the screened population.

  7. Electrical safety device

    DOEpatents

    White, David B.

    1991-01-01

    An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.

  8. Water Budget Quick Start Guide

    EPA Pesticide Factsheets

    WaterSense created the Water Budget Tool as one option to help builders, landscape professionals, and irrigation professionals certified by a WaterSense labeled program meet the criteria specified in the WaterSense New Home Specification.

  9. Precise Restraightening of Bent Studs

    NASA Technical Reports Server (NTRS)

    Boardman, R. E.

    1982-01-01

    Special tool quickly bends studs back into shape accurately and safely by force applied by hydraulic ram, with deflection being measured by dial indicator. Ram and indicator can be interchanged for straightening in reverse direction.

  10. National PKU News

    MedlinePlus

    ... Enter your email for our monthly newsletter * Need books? Our pocket books are great reference tools for caregivers, relatives, and ... quick reference in your desk at work. Baby Books are provided free to all newborn PKU patients ...

  11. The development of a digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey Lindsay

    Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important, because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include on the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical testing theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential for providing a meaningful tool for comparing student learning at different institutions.

  12. GSAC - Generic Seismic Application Computing

    NASA Astrophysics Data System (ADS)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format has established it as an important component of seismological research. Tools for working with SAC files fall into two categories - custom research-quality processing codes and shared display/processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and development of expertise in its use. We believe that there is a place for new, especially open source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features and openness so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend. It is command line based and is easily included within shell processing scripts. PySAC is a set of Python functions that allow easy access to SAC files and enable efficient manipulation of SAC files under a variety of operating systems. PySAC has proven to be valuable in organizing large data sets. An array processing package includes standard beamforming algorithms and a search-based method for inference of slowness vectors. The search results can be visualized using GMT scripts output by the C programs, and the resulting snapshots can be combined into an animation of the time evolution of the 2D slowness field.
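
    The abstract does not spell out PySAC's interface, so the sketch below is only a hedged illustration of scripted SAC-file handling in Python; it uses the independent ObsPy library (which reads and writes the SAC format), and the file names are hypothetical.

    ```python
    # Illustration of scripted SAC-file handling in Python using ObsPy;
    # this is not the PySAC API described in the abstract.
    from obspy import read

    st = read("example.sac")               # hypothetical SAC file
    tr = st[0]                             # single trace in the file
    print(tr.stats.station, tr.stats.sampling_rate, len(tr.data))

    tr.detrend("demean")                   # remove the mean
    tr.filter("bandpass", freqmin=0.02, freqmax=0.1)   # long-period band
    tr.write("example_filtered.sac", format="SAC")
    ```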

  13. minimega v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crussell, Jonathan; Erickson, Jeremy; Fritz, David

    minimega is an emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines, including Windows, Linux, and Android. minimega allows experiments to be brought up quickly with almost no configuration. minimega also includes tools for simple cluster management, as well as tools for creating Linux-based virtual machines. This release of minimega includes new emulated sensors for Android devices to improve the fidelity of testbeds that include mobile devices. Emulated sensors include GPS and

  14. Automated Topographic Change Detection via Dem Differencing at Large Scales Using The Arcticdem Database

    NASA Astrophysics Data System (ADS)

    Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.

    2016-12-01

    In the last decade, high resolution satellite imagery has become an increasingly accessible tool for geoscientists to quantify changes in the Arctic land surface due to geophysical, ecological and anthropogenic processes. However, the trade-off between spatial coverage and spatio-temporal resolution has limited detailed, process-level change detection over large (i.e. continental) scales. The ArcticDEM project utilized over 300,000 Worldview image pairs to produce an elevation model with nearly 100% coverage above 60°N, offering the first polar, high-resolution (2-8 m by region) dataset with broad spatial coverage, often with multiple repeats in areas of particular interest to geoscientists. A dataset of this size (nearly 250 TB) offers endless new avenues of scientific inquiry, but quickly becomes unmanageable computationally and logistically for the computing resources available to the average scientist. Here we present TopoDiff, a framework for a generalized, automated workflow that requires minimal input from the end user about a study site and utilizes cloud computing resources to provide a temporally sorted and differenced dataset, ready for geostatistical analysis. This hands-off approach allows the end user to focus on the science, without having to manage thousands of files or petabytes of data. At the same time, TopoDiff provides a consistent and accurate workflow for image sorting, selection, and co-registration, enabling cross-comparisons between research projects.
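
    Once two tiles are co-registered onto a common grid, the differencing step at the core of such a workflow is a per-pixel subtraction. The sketch below is a minimal illustration under that assumption; the file names and the use of the rasterio library are illustrative, not part of the TopoDiff framework itself.

    ```python
    # Minimal DEM differencing sketch, assuming two already co-registered
    # ArcticDEM-style tiles on the same grid (file names are hypothetical).
    import numpy as np
    import rasterio

    with rasterio.open("dem_2012.tif") as early, rasterio.open("dem_2016.tif") as late:
        z0 = early.read(1).astype("float64")
        z1 = late.read(1).astype("float64")
        nodata = early.nodata

    valid = (z0 != nodata) & (z1 != nodata)     # keep pixels valid in both epochs
    dz = (z1 - z0)[valid]                       # elevation change in metres

    print(f"mean change: {dz.mean():+.2f} m; "
          f"pixels changed by > 1 m: {int((np.abs(dz) > 1).sum())}")
    ```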

  15. Aspects of perturbation theory in quantum mechanics: The BenderWu MATHEMATICA® package

    NASA Astrophysics Data System (ADS)

    Sulejmanpasic, Tin; Ünsal, Mithat

    2018-07-01

    We discuss a general setup which allows the study of the perturbation theory of an arbitrary, locally harmonic 1D quantum mechanical potential as well as its multi-variable (many-body) generalization. The latter may form a prototype for regularized quantum field theory. We first generalize the method of Bender and Wu, and derive exact recursion relations which allow the determination of the perturbative wave-function and energy corrections to an arbitrary order, at least in principle. For 1D systems, we implement these equations in an easy-to-use MATHEMATICA® package we call BenderWu. Our package enables quick home-computer computation of high orders of perturbation theory (about 100 orders in 10-30 s, and 250 orders in 1-2 h) and enables practical study of a large class of problems in Quantum Mechanics. We have two hopes concerning the BenderWu package. One is that, due to resurgence, a large amount of non-perturbative information, such as non-perturbative energies and wave-functions (e.g. WKB wave functions), can in principle be extracted from the perturbative data. We also hope that the package may be used as a teaching tool, providing an effective bridge between perturbation theory and non-perturbative physics in textbooks. Finally, we show that for the multi-variable case, the recursion relation acquires a geometric character, and has a structure which allows parallelization to computer clusters.
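
    For concreteness, the kind of series the package produces can be illustrated with the quartic anharmonic oscillator; in one standard normalization the first few coefficients are a textbook result (this example is not taken from the paper itself):

    ```latex
    % Ground-state energy of H = p^2/2 + x^2/2 + g x^4, as a formal series:
    E_0(g) \sim \frac{1}{2} + \frac{3}{4}\,g - \frac{21}{8}\,g^2
            + \frac{333}{16}\,g^3 - \cdots
    % The coefficients grow factorially, so the series is asymptotic;
    % this factorial growth is what resurgence exploits to recover
    % non-perturbative information from perturbative data.
    ```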

  16. Hydrological analysis in R: Topmodel and beyond

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Reusser, D.

    2011-12-01

    R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological - and by extension other environmental - models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will need to be developed.

  17. New technologies for advanced three-dimensional optimum shape design in aeronautics

    NASA Astrophysics Data System (ADS)

    Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno

    1999-05-01

    The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes in a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are making such an ambitious project, of including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
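
    The paper obtains its sensitivity code by source transformation of the flow solver; as a small, self-contained illustration of the principle behind automated differentiation, here is a toy forward-mode sketch using dual numbers (it is not the tool used by the authors):

    ```python
    # Toy forward-mode automatic differentiation via dual numbers: each
    # value carries its derivative, propagated by the sum and product rules.
    class Dual:
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot          # value and derivative
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.val * o.dot + self.dot * o.val)  # product rule
        __rmul__ = __mul__

    def f(x):                 # toy "solver": f(x) = 3x^2 + 2x + 1
        return 3 * x * x + 2 * x + 1

    x = Dual(2.0, 1.0)        # seed dx/dx = 1
    y = f(x)
    print(y.val, y.dot)       # 17.0 and f'(2) = 6*2 + 2 = 14.0
    ```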

  18. BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation

    PubMed Central

    Kiefer, Christina; Fehlmann, Tobias; Backes, Christina

    2017-01-01

    Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate performance comparable to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498
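
    For readers unfamiliar with reference-independent binning, the sketch below clusters toy contigs by tetranucleotide frequency with k-means. It illustrates the general idea only; BusyBee Web's bootstrapped supervised binning and 2D embedding are not reproduced here.

    ```python
    # Generic illustration of reference-independent binning: cluster
    # contigs by tetranucleotide (4-mer) frequency with k-means.
    from itertools import product
    import numpy as np
    from sklearn.cluster import KMeans

    KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
    IDX = {k: i for i, k in enumerate(KMERS)}

    def tetra_freq(seq):
        v = np.zeros(len(KMERS))
        for i in range(len(seq) - 3):
            j = IDX.get(seq[i:i + 4])
            if j is not None:              # skip k-mers with ambiguous bases
                v[j] += 1
        return v / max(v.sum(), 1)

    contigs = ["ACGTACGTGGCCAAT" * 30, "TTTTAAAACCCCGGG" * 30,
               "ACGTACGTGGCCAAC" * 30]     # toy "assembled contigs"
    X = np.array([tetra_freq(c) for c in contigs])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(labels)                           # bin assignment per contig
    ```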

  19. Abdominal CT scan

    MedlinePlus

    Computed tomography scan - abdomen; CT scan - abdomen; CT abdomen and pelvis ... An abdominal CT scan makes detailed pictures of the structures inside your belly very quickly. This test may be used to look ...

  1. The CCTC Quick-Reacting General War Gaming System (QUICK) Program Maintenance Manual. Volume I. Data Management Subsystem. Change 3.

    DTIC Science & Technology

    1980-05-22

    cross-referenced with the number of the data transaction listed in the data module quality control list. NVB: integer variable used to...Organization of the Joint Chiefs of Staff. Technical support was provided by System Sciences, Incorporated under Contract Number DCA100-75-C-0019. Change set... Contract Number DCA 100-75-C-0019. Change set two was prepared under Contract Number DCA 100-78-C-0035. Computer Sciences Corporation prepared change

  2. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.

  4. Quick-Connect/Disconnect Joint For Truss Structures

    NASA Technical Reports Server (NTRS)

    Sprague, Benny B.

    1991-01-01

    Simple connector used for temporary structures and pipes. Truss connector joins and aligns structural members. Consists of two sections, one flanged and other with mating internal groove. When flanged half inserted in groove, moves lever of trigger mechanism upward. Cone then shoots into grooved half. Attached without tools in less than 2 seconds and taken apart just as quickly and easily. Developed for assembling structures in outer space, also useful for temporary terrestrial structures like scaffolds and portable bleachers. With modifications, used to join sections of pipelines carrying liquids or gases.

  5. A Computer-Based Laboratory Project for the Study of Stimulus Generalization and Peak Shift

    ERIC Educational Resources Information Center

    Derenne, Adam; Loshek, Eevett

    2009-01-01

    This paper describes materials designed for classroom projects on stimulus generalization and peak shift. A computer program (originally written in QuickBASIC) is used for data collection and a Microsoft Excel file with macros organizes the raw data on a spreadsheet and creates generalization gradients. The program is designed for use with human…

  6. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices

    ERIC Educational Resources Information Center

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun

    2016-01-01

    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  7. Studying the Effects of Nuclear Weapons Using a Slide-Rule Computer

    ERIC Educational Resources Information Center

    Shastri, Ananda

    2007-01-01

    This paper describes the construction of a slide-rule computer that allows one to quickly determine magnitudes of several effects that result from the detonation of a nuclear device. Suggestions for exercises are also included that allow high school and college-level physics students to explore scenarios involving these effects. It is hoped that…

  8. LINUX, Virtualization, and the Cloud: A Hands-On Student Introductory Lab

    ERIC Educational Resources Information Center

    Serapiglia, Anthony

    2013-01-01

    Many students are entering Computer Science education with limited exposure to operating systems and applications other than those produced by Apple or Microsoft. This gap in familiarity with the Open Source community can quickly be bridged with a simple exercise that can also be used to strengthen two other important current computing concepts,…

  9. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…

  10. Analyzing Sliding Stability of Structures Using the Modified Computer Program GWALL. Revision,

    DTIC Science & Technology

    1983-11-01

    Army Engineer Waterways Experiment Station, Vicksburg, MS. ...GWALL and/or the graphics software package, Graphics Compatibility System (GCS). Input Features 4. GWALL is very easy to use because it allows the...Prepared Data File 9. Time-sharing computer systems do not always respond quickly to the user's commands, especially when there are many users

  11. Foundation of a Knowledge Representation System for Image Understanding.

    DTIC Science & Technology

    1980-10-01

    This is useful for holding the system together, for computing similarity between objects, for quickly retrieving desired information in as detailed a...mined by how much precision is needed to carry through the current computation. In Section 2, we discuss the OVS system itself, its structure and how...Our goal here is to present the computational constraints involved in the design of a knowledge representation system which is

  12. Usability of a patient education and motivation tool using heuristic evaluation.

    PubMed

    Joshi, Ashish; Arora, Mohit; Dai, Liwei; Price, Kathleen; Vizer, Lisa; Sears, Andrew

    2009-11-06

    Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation. The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT. PEMT was evaluated by three usability experts using Nielsen's usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic. A total of 127 violations were identified with a median severity of 3 (range 0 to 4 with 0 = no problem to 4 = catastrophic problem). Results showed 13 violations for visibility (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error severity (median severity = 3), 1 violation for recognition and control (median severity = 3), 7 violations for flexibility and efficiency (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4). We describe the heuristic evaluation method employed to assess the usability of PEMT, a method which uncovers heuristic violations in the interface design in a quick and efficient manner. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help to identify problems in a timely manner. This makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluations provided a means to assess the user interface of PEMT.
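
    The tallying behind results like these is straightforward to script; the sketch below groups invented violation records by heuristic and reports the count and median severity for each (the records are illustrative, not the study's data):

    ```python
    # Sketch of the tallying step in a heuristic evaluation: group
    # violations by heuristic, then report count and median severity (0-4).
    from collections import defaultdict
    from statistics import median

    violations = [                              # (heuristic, severity)
        ("visibility of system status", 2),
        ("visibility of system status", 3),
        ("consistency and standards", 2),
        ("help and documentation", 4),
        ("consistency and standards", 1),
    ]

    by_heuristic = defaultdict(list)
    for heuristic, severity in violations:
        by_heuristic[heuristic].append(severity)

    for heuristic, sevs in sorted(by_heuristic.items()):
        print(f"{heuristic}: {len(sevs)} violations, "
              f"median severity = {median(sevs)}")
    ```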

  14. MODELING OF HIGH SPEED FRICTION STIR SPOT WELDING USING A LAGRANGIAN FINITE ELEMENT APPROACH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles, Michael; Karki, U.; Woodward, C.

    2013-09-03

    Friction stir spot welding (FSSW) has been shown to be capable of joining steels of very high strength, while also being very flexible in terms of controlling the heat of welding and the resulting microstructure of the joint. This makes FSSW a potential alternative to resistance spot welding (RSW) if tool life is sufficiently high, and if machine spindle loads are sufficiently low that the process can be implemented on an industrial robot. Robots for spot welding can typically sustain vertical loads of about 8 kN, but FSSW at tool speeds of less than 3000 rpm causes loads that are too high, in the range of 11-14 kN. Therefore, in the current work tool speeds of 3000 rpm and higher were employed, in order to generate heat more quickly and to reduce welding loads to acceptable levels. The FSSW process was modeled using a finite element approach with the Forge® software package. An updated Lagrangian scheme with explicit time integration was employed to model the flow of the sheet material, subjected to boundary conditions of a rotating tool and a fixed backing plate [3]. The modeling approach can be described as two-dimensional, axisymmetric, but with an aspect of three dimensions in terms of thermal boundary conditions. Material flow was calculated from a velocity field which was two-dimensional, but heat generated by friction was computed using a virtual rotational velocity component from the tool surface. An isotropic, viscoplastic Norton-Hoff law was used to model the evolution of material flow stress as a function of strain, strain rate, and temperature. The model predicted welding temperatures and the movement of the joint interface with reasonable accuracy for the welding of a dual phase 980 steel.

  15. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perspectives on the application of data and scientific techniques, and it handles simultaneous users' tasks through multiprocessing. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  16. Toward Rigorous Parameterization of Underconstrained Neural Network Models Through Interactive Visualization and Steering of Connectivity Generation

    PubMed Central

    Nowke, Christian; Diaz-Pier, Sandra; Weyers, Benjamin; Hentschel, Bernd; Morrison, Abigail; Kuhlen, Torsten W.; Peyser, Alexander

    2018-01-01

    Simulation models in many scientific fields can have non-unique solutions or unique solutions which can be difficult to find. Moreover, in evolving systems, unique final state solutions can be reached by multiple different trajectories. Neuroscience is no exception. Often, neural network models are subject to parameter fitting to obtain desirable output comparable to experimental data. Parameter fitting without sufficient constraints and a systematic exploration of the possible solution space can lead to conclusions valid only around local minima or around non-minima. To address this issue, we have developed an interactive tool for visualizing and steering parameters in neural network simulation models. In this work, we focus particularly on connectivity generation, since finding suitable connectivity configurations for neural network models constitutes a complex parameter search scenario. The development of the tool has been guided by several use cases—the tool allows researchers to steer the parameters of the connectivity generation during the simulation, thus quickly growing networks composed of multiple populations with a targeted mean activity. The flexibility of the software allows scientists to explore other connectivity and neuron variables apart from the ones presented as use cases. With this tool, we enable an interactive exploration of parameter spaces and a better understanding of neural network models and grapple with the crucial problem of non-unique network solutions and trajectories. In addition, we observe a reduction in turn around times for the assessment of these models, due to interactive visualization while the simulation is computed. PMID:29937723

  17. STS-41 Onboard 16mm Photography Quick Release

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This videotape features scenes of onboard activities. The videotape was shot by the crew. The scenes include the following: Ulysses' deployment, middeck experiments, computer workstations, and Earth payload bay views.

  18. 76 FR 7868 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... Special Emphasis Panel, Small Business: Computational Biology, Image Processing and Data Mining. Date... for Scientific Review Special Emphasis Panel, Quick Trial on Imaging and Image-Guided Intervention...

  19. Comprehensive evaluation of transportation projects : a toolkit for sketch planning.

    DOT National Transportation Integrated Search

    2010-10-01

    A quick-response project-planning tool can be extremely valuable in anticipating the congestion, safety, : emissions, and other impacts of large-scale network improvements and policy implementations. This report : identifies the advantages and limita...

  20. Graphics Software Packages as Instructional Tools.

    ERIC Educational Resources Information Center

    Chiavaroli, Julius J.; Till, Ronald J.

    1985-01-01

    Graphics software can assist hearing-impaired students in visualizing and comparing ideas and can also demonstrate spatial relations and encourage creativity. Teachers and students can create and present data, diagrams, drawings, or charts quickly and accurately. (Author/CL)

  1. Data-driven traffic impact assessment tool for work zones.

    DOT National Transportation Integrated Search

    2017-03-01

    Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...

  2. Web-Based Model Visualization Tools to Aid in Model Optimization and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Alder, J.; van Griensven, A.; Meixner, T.

    2003-12-01

    Individuals applying hydrologic models need quick, easy-to-use visualization tools that permit them to assess and understand model performance. We present here the Interactive Hydrologic Modeling (IHM) visualization toolbox. The IHM utilizes high-speed Internet access, the portability of the web and the increasing power of modern computers to provide an online toolbox for quick and easy model result visualization. This visualization interface allows for the interpretation and analysis of Monte-Carlo and batch model simulation results. Often, a given project will generate several thousand or even hundreds of thousands of simulations. This large number of simulations creates a challenge for post-simulation analysis. IHM's goal is to solve this problem by loading all of the data into a database with a web interface that can dynamically generate graphs for the user according to their needs. IHM currently supports: a global sample statistics table (e.g. sum of squares error, sum of absolute differences, etc.), a top-ten simulations table and graphs, graphs of an individual simulation using time-step data, objective-based dotty plots, threshold-based parameter cumulative density function graphs (as used in the regional sensitivity analysis of Spear and Hornberger) and 2D error-surface graphs of the parameter space. IHM scales from the simplest bucket model to the largest set of Monte-Carlo model simulations with a multi-dimensional parameter and model output space. By using a web interface, IHM offers the user complete flexibility in the sense that they can be anywhere in the world using any operating system. IHM can be a time-saving and money-saving alternative to producing graphs by hand, to conducting analysis that may not be informative, or to being forced to purchase expensive and proprietary software. IHM is a simple, free method of interpreting and analyzing batch model results, and is suitable for novice to expert hydrologic modelers.
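
    As a minimal illustration of the objective-based dotty plot described above, the sketch below scatters a sum-of-squares objective against sampled parameter values, with a toy model standing in for a hydrologic simulation (all names and data are invented):

    ```python
    # Dotty plot sketch: one dot per Monte-Carlo run, objective value
    # vs. the sampled parameter value for that run.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    obs = 2.5                                   # "observed" response
    params = rng.uniform(0.0, 5.0, 2000)        # Monte-Carlo parameter samples
    sim = params ** 0.5 * 1.1                   # toy model output per sample
    sse = (sim - obs) ** 2                      # squared-error objective per run

    plt.scatter(params, sse, s=4, alpha=0.4)
    plt.xlabel("parameter value")
    plt.ylabel("SSE objective")
    plt.title("Dotty plot: a well-identified parameter shows a clear minimum")
    plt.show()
    ```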

  3. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning. © 2012 American Association of Anatomists.

  4. Quickly updatable hologram images with high performance photorefractive polymer composites

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Naoto; Kinashi, Kenji; Nonomura, Asato; Sakai, Wataru

    2012-02-01

    We present here quickly updatable hologram images using a high-performance photorefractive (PR) polymer composite based on poly(N-vinyl carbazole) (PVCz). PVCz is one of the pioneering photoconductive polymers. PVCz/7-DCST/CzEPA/TNF (44/35/20/1 by wt) gives a high diffraction efficiency of 68% at E = 45 V/μm with fast response speed. The response speed of optical diffraction is the key parameter for real-time 3D holographic display. The key to obtaining quickly updatable hologram images is to keep the glass transition temperature low enough to enhance chromophore orientation. An object image of a reflective coin surface, recorded with a reference beam at 532 nm (green) in the PR polymer composite, is simultaneously reconstructed using a red probe beam at 642 nm. In place of the coin, an object image produced by a computer and displayed on a spatial light modulator (SLM) was also used as the hologram object. The object beam reflected from the SLM interfered with the reference beam on the PR polymer composite to record a hologram, which was simultaneously reconstructed by the red probe beam. A movie produced on a computer was recorded as a real-time hologram in the PR polymer composite and simultaneously, clearly reconstructed at video rate.

  5. Effects of Thai Dancing on Median Neurodynamic Response During 4-Hour Computer Use.

    PubMed

    Mekhora, Keerin; Septham, Chatdao; Jalayondeja, Wattana

    2015-06-01

    To investigate the effects of Thai dancing on median neurodynamic response during 4-hour computer use, twenty-four healthy participants aged 20-30 years performed 5 minutes of Thai dancing, including Prom See Na, Yoong Fon Hang, Sod Soy Mala, Lor Keaw and Cha Nee Rai Mai, during a 10-minute break in 4 hours of computer use. All participants were assessed for nerve tension by the elbow range of motion of upper limb neurodynamic test 1 (ULNT1) and by components of a quick test. Discomfort was measured by a visual analogue discomfort scale (VADS). These measurements were taken before and after computer work. The statistical analyses employed the paired t-test for continuous outcomes and Friedman's test. Median nerve tension (indicated by elbow range of motion) was significantly reduced both before and after work when 5 minutes of Thai dancing was introduced during the break, and components of the quick test showed that Thai dancing immediately helped reduce median nerve tension. The VADS in eight body areas increased over the 4-hour period but decreased after performing Thai dancing (p<0.05). Thai dancing helped relieve median nerve tension and body discomfort. It may be recommended as an exercise during breaks for computer users who work continuously, to prevent WMSDs.

  6. Distributed Automated Medical Robotics to Improve Medical Field Operations

    DTIC Science & Technology

    2010-04-01

    Robotic trauma diagnosis and intervention is performed using instruments and tools mounted on the end of a robotic manipulator...manipulator to respond quickly enough to accommodate motion due to high inertia and inaccuracies caused by low stiffness at the tool point. Ultrasonic...program was licensed to Intuitive Surgical, Inc. and subsequently morphed into the daVinci surgical system. The daVinci has been widely applied in

  7. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  8. Electronic Resources Evaluation Central: Using Off-the-Shelf Software, Web 2.0 Tools, and LibGuides to Manage an Electronic Resources Evaluation Process

    ERIC Educational Resources Information Center

    England, Lenore; Fu, Li

    2011-01-01

    A critical part of electronic resources management, the electronic resources evaluation process is multi-faceted and includes a seemingly endless range of resources and tools involving numerous library staff. A solution is to build a Web site to bring all of the components together that can be implemented quickly and result in an organizational…

  9. Software on diffractive optics and computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Doskolovich, Leonid L.; Golub, Michael A.; Kazanskiy, Nikolay L.; Khramov, Alexander G.; Pavelyev, Vladimir S.; Seraphimovich, P. G.; Soifer, Victor A.; Volotovskiy, S. G.

    1995-01-01

    The `Quick-DOE' software for IBM PC-compatible computers is aimed at calculating the masks of diffractive optical elements (DOEs) and computer-generated holograms, at computer simulation of DOEs, and at executing a number of auxiliary functions. In particular, among the auxiliary functions are file format conversions, mask visualization on a display from a file, implementation of fast Fourier transforms, and the arranging and preparation of composite images for output on a photoplotter. The software is intended for use by opticians, DOE designers, and programmers dealing with the development of programs for DOE computation.
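
    As an illustration of the FFT machinery such a package wraps, the sketch below computes a one-pass kinoform (a phase-only computer-generated hologram) for a simple target. This is a generic textbook construction, not the Quick-DOE algorithm itself.

    ```python
    # One-pass kinoform sketch: keep only the phase of the inverse
    # transform of the target field, then check the reconstruction.
    import numpy as np

    N = 256
    target = np.zeros((N, N))
    target[96:160, 96:160] = 1.0                 # desired far field: a square

    rng = np.random.default_rng(1)
    field = target * np.exp(2j * np.pi * rng.random((N, N)))   # random phase
    phase_mask = np.angle(np.fft.ifft2(field))   # the DOE phase profile

    recon = np.abs(np.fft.fft2(np.exp(1j * phase_mask)))       # reconstruction
    print("correlation with target:",
          round(float(np.corrcoef(recon.ravel(), target.ravel())[0, 1]), 3))
    ```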

  10. Is 1984 Coming or Is It Here?

    ERIC Educational Resources Information Center

    McGowan, William F.

    1983-01-01

    Warns that relying on computers or quick accountability solutions to our educational or moral dilemma only increases the forces of both good and evil and hastens the day of "1984" and Big Brother. (MM)

  11. What Works Clearinghouse Quick Review: "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions"

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2014

    2014-01-01

    This study examined how using two different ways of displaying the solar system--a true-to-scale mode vs. an orrery mode--affected students' knowledge of astronomical concepts. Solar system displays were presented in a software application on a handheld tablet computer. In the true-to-scale mode, users navigated a simulated three-dimensional solar…

  12. Using Math Apps for Improving Student Learning: An Exploratory Study in an Inclusive Fourth Grade Classroom

    ERIC Educational Resources Information Center

    Zhang, Meilan; Trussell, Robert P.; Gallegos, Benjamin; Asam, Rasmiyeh R.

    2015-01-01

    Recent years have seen a quick expansion of tablet computers in households and schools. One of the educational affordances of tablet computers is using math apps to engage students in mathematics learning. However, given the short history of the mobile devices, little research exists on the effectiveness of math apps, particularly for struggling…

  13. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  14. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run these algorithms, these solutions do not fully address the challenge because (i) calibration can still be too time-consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option for addressing this challenge that we are exploring through this work is the use of the cloud for speeding up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration model run. The cloud allows one to precisely balance the duration of the calibration with the financial costs so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-ups across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor the job submission process during the calibration. Finally, this talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including tasks related to preparing inputs for constructing place-based hydrologic models.
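
    A minimal sketch of the embarrassingly parallel pattern described here, using local cores via Python's multiprocessing in place of rented cloud machines; the toy objective stands in for a SWAT run, the parameter names are invented, and this is a plain random search rather than the DDS algorithm:

    ```python
    # Parallel evaluation of candidate parameter sets: the same pattern
    # scales from local cores to rented cloud machines.
    from multiprocessing import Pool
    import random

    def objective(params):
        """Toy stand-in for one SWAT simulation plus error vs. observations."""
        cn2, alpha = params
        return (cn2 - 62.0) ** 2 + (alpha - 0.35) ** 2

    def random_candidates(n, seed=0):
        rng = random.Random(seed)
        return [(rng.uniform(35, 98), rng.uniform(0, 1)) for _ in range(n)]

    if __name__ == "__main__":
        candidates = random_candidates(256)
        with Pool() as pool:                     # one worker per core
            errors = pool.map(objective, candidates)
        best_err, best_params = min(zip(errors, candidates))
        print(f"best error {best_err:.4f} at params {best_params}")
    ```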

  15. CloudSat Reflectivity Data Visualization Inside Hurricanes

    NASA Technical Reports Server (NTRS)

    Suzuki, Shigeru; Wright, John R.; Falcon, Pedro C.

    2011-01-01

    Animations and other outreach products have been created and released to the public quickly after the CloudSat spacecraft flew over hurricanes. The automated script scans through the CloudSat quicklook data to find significant atmospheric moisture content. Once such a region is found, data from multiple sources is combined to produce the data products and the animations. KMZ products are quickly generated from the quicklook data for viewing in Google Earth and other tools. Animations are also generated to show the atmospheric moisture data in context with the storm cloud imagery. Global images from GOES satellites are shown to give context. The visualization provides better understanding of the interior of the hurricane storm clouds, which is difficult to observe directly. The automated process creates the finished animation in the High Definition (HD) video format for quick release to the media and public.

  16. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2013-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry, and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis, and decision-making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers, and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University, and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013.
    1. FOQUS: Framework for Optimization and Quantification of Uncertainty and Sensitivity. The package includes the FOQUS Graphical User Interface (GUI), a simulation-based optimization engine, the Turbine Client, and heat integration capabilities. There is also an updated simulation interface and a new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway.
    2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition.
    3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system.
    4. A completely rewritten version of the algebraic surrogate model builder for optimization (ALAMO). The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency (a toy illustration of the surrogate idea follows this list).
    5. A new suite of high-resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path.
    6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster.
    7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models.
    8. A new basic data submodel in Aspen Plus format for a representative high-viscosity capture solvent, the 2-MPZ system.
    9. An updated RM tool for CFD (REVEAL) that can create an RM from MFIX. A new lightweight, stand-alone version will be available in late 2013.
    10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator.
    11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors.
    12. An updated and unified set of compressor models, including a steady-state design point model and a dynamic model with surge detection.
    13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU).
    14. A new technical risk model in spreadsheet format.
    15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the first-generation sorbent model.
    16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process.
    17. Validation models for the NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014.
    18. An updated hollow fiber membrane model and system example for carbon capture.
    19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable the heat integration module.
    20. An updated financial risk model in spreadsheet format.
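
    To make item 4 above concrete: algebraic surrogate modeling of the ALAMO kind replaces an expensive simulation with a compact closed-form expression selected from a small set of candidate basis functions. The Python sketch below illustrates that idea with an assumed basis and best-subset selection; it is not the CCSI/ALAMO implementation.

      # Illustrative algebraic surrogate fit: pick the best small subset of
      # candidate basis functions by least squares. NOT the ALAMO code.
      import itertools
      import numpy as np

      def fit_surrogate(X, y, max_terms=3):
          """Select up to max_terms basis functions minimizing squared error."""
          basis = [                                    # assumed candidate basis
              ("1",       lambda x: np.ones(len(x))),
              ("x",       lambda x: x[:, 0]),
              ("x^2",     lambda x: x[:, 0] ** 2),
              ("log(x)",  lambda x: np.log(x[:, 0])),
              ("exp(-x)", lambda x: np.exp(-x[:, 0])),
          ]
          best = None
          for k in range(1, max_terms + 1):
              for combo in itertools.combinations(basis, k):
                  A = np.column_stack([f(X) for _, f in combo])
                  coef = np.linalg.lstsq(A, y, rcond=None)[0]
                  err = float(np.sum((A @ coef - y) ** 2))
                  if best is None or err < best[0]:
                      best = (err, combo, coef)
          err, combo, coef = best
          model = " + ".join(f"{c:.4g}*{name}" for (name, _), c in zip(combo, coef))
          return model, err

      # Toy "simulation" data: y = 2*x^2 plus a little noise.
      rng = np.random.default_rng(0)
      X = rng.uniform(0.5, 2.0, size=(50, 1))
      y = 2.0 * X[:, 0] ** 2 + rng.normal(0.0, 0.01, 50)
      print(fit_surrogate(X, y))   # expect something close to "2*x^2"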

  17. Efficient volumetric estimation from plenoptic data

    NASA Astrophysics Data System (ADS)

    Anglin, Paul; Reeves, Stanley J.; Thurow, Brian S.

    2013-03-01

    The commercial release of the Lytro camera, and the greater availability of plenoptic imaging systems in general, have given the image processing community cost-effective tools for light-field imaging. While this data is most commonly used to generate planar images at arbitrary focal depths, reconstruction of volumetric fields is also possible. Similarly, deconvolution is a technique that is conventionally used in planar image reconstruction, or deblurring, algorithms. However, when leveraged with the ability of a light-field camera to quickly reproduce multiple focal planes within an imaged volume, deconvolution offers a computationally efficient method of volumetric reconstruction. Related research has shown that light-field imaging systems in conjunction with tomographic reconstruction techniques are also capable of estimating the imaged volume and have been successfully applied to particle image velocimetry (PIV). However, while tomographic volumetric estimation through algorithms such as the multiplicative algebraic reconstruction technique (MART) has proven to be highly accurate, it is computationally intensive. In this paper, the reconstruction problem is shown to be solvable by deconvolution. Deconvolution offers significant improvement in computational efficiency through the use of fast Fourier transforms (FFTs) when compared to other tomographic methods. This work describes a deconvolution algorithm designed to reconstruct a 3-D particle field from simulated plenoptic data. A 3-D extension of existing 2-D FFT-based refocusing techniques is presented to further improve efficiency when computing object focal stacks and system point spread functions (PSF). Reconstruction artifacts are identified; their underlying source and methods of mitigation are explored where possible, and reconstructions of simulated particle fields are provided.
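
    To illustrate why FFT-based deconvolution is attractive here, the sketch below applies a generic Wiener-style frequency-domain deconvolution to a toy 3-D particle volume blurred by an assumed Gaussian PSF. It is a minimal illustration of the general technique, not the authors' plenoptic algorithm.

      # Generic Wiener-style deconvolution: cheap because it costs one forward
      # and one inverse FFT per volume. Not the authors' 3-D plenoptic code.
      import numpy as np

      def wiener_deconvolve(blurred, psf, k=1e-3):
          """Deconvolve `blurred` by `psf` with Tikhonov-style regularization k."""
          H = np.fft.fftn(np.fft.ifftshift(psf), s=blurred.shape)
          G = np.fft.fftn(blurred)
          F = np.conj(H) * G / (np.abs(H) ** 2 + k)   # Wiener filter
          return np.real(np.fft.ifftn(F))

      # Toy volume: two point "particles" blurred by an assumed Gaussian PSF.
      shape = (32, 32, 32)
      vol = np.zeros(shape)
      vol[8, 8, 8] = vol[20, 16, 24] = 1.0
      zz, yy, xx = np.meshgrid(*(np.arange(n) - n // 2 for n in shape), indexing="ij")
      psf = np.exp(-(xx ** 2 + yy ** 2 + zz ** 2) / (2.0 * 1.5 ** 2))
      psf /= psf.sum()
      blurred = np.real(np.fft.ifftn(np.fft.fftn(vol) * np.fft.fftn(np.fft.ifftshift(psf))))
      recovered = wiener_deconvolve(blurred, psf)
      print("recovered peak:", np.unravel_index(np.argmax(recovered), shape))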

  18. Using tablet computers to teach evidence-based medicine to pediatrics residents: a prospective study.

    PubMed

    Soma, David B; Homme, Jason H; Jacobson, Robert M

    2013-01-01

    We sought to determine if tablet computers, supported by a laboratory experience focused on skill development, would improve not only evidence-based medicine (EBM) knowledge but also skills and behavior. We conducted a prospective cohort study where we provided tablet computers to our pediatric residents and then held a series of laboratory sessions focused on speed and efficiency in performing EBM at the bedside. We evaluated the intervention with pre- and postintervention tests and surveys based on a validated tool available for use on MedEdPORTAL. The attending pediatric hospitalists also completed surveys regarding their observations of the residents' behavior. All 38 pediatric residents completed the preintervention test and the pre- and postintervention surveys. All but one completed the posttest. All 7 attending pediatric hospitalists completed their surveys. The testing, targeted to assess EBM knowledge, revealed a median increase of 16 points out of a possible 60 points (P < .0001). We found substantial increases in individual residents' test scores across all 3 years of residency. Resident responses demonstrated statistically significant improvements in self-reported comfort with 6 out of 6 EBM skills and statistically significant increases in self-reported frequencies for 4 out of 7 EBM behaviors. Attending pediatric hospitalists reported improvements in 5 of 7 resident behaviors. This novel approach for teaching EBM to pediatric residents improved knowledge, skills, and behavior through the introduction of a tablet computer and laboratory sessions designed to teach the quick and efficient application of EBM at the bedside. Copyright © 2013 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  19. Model Atmospheres and Spectral Irradiance Library of the Exoplanet Host Stars Observed in the MUSCLES Survey

    NASA Astrophysics Data System (ADS)

    Linsky, Jeffrey

    2017-08-01

    We propose to compute state-of-the-art model atmospheres (photospheres, chromospheres, transition regions and coronae) of the 4 K-type and 7 M-type exoplanet host stars observed by HST in the MUSCLES Treasury Survey, the nearest host star Proxima Centauri, and TRAPPIST-1. Our semi-empirical models will fit the unique high-resolution panchromatic (X-ray to infrared) spectra of these stars in the MAST High-Level Science Products archive, consisting of COS and STIS UV spectra and near-simultaneous Chandra, XMM-Newton, and ground-based observations. We will compute models with the fully tested SSRPM computer software incorporating 52 atoms and ions in full non-LTE (435,986 spectral lines) and the 20 most-abundant diatomic molecules (about 2 million lines). This code has successfully fit the panchromatic spectrum of the M1.5 V exoplanet host star GJ 832 (Fontenla et al. 2016), the first M star with such a detailed model, as well as solar spectra. Our models will (1) predict the unobservable extreme-UV spectra, (2) determine radiative energy losses and balancing heating rates throughout these atmospheres, (3) compute a stellar irradiance library needed to describe the radiation environment of potentially habitable exoplanets to be studied by TESS and JWST, and (4) in the long post-HST era when UV observations will not be possible, serve as a powerful tool for predicting the panchromatic spectra of host stars that have only limited spectral coverage, in particular no UV spectra. The stellar models and spectral irradiance library will be placed quickly in MAST.

  20. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    PubMed

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.
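
    The system size expansion that iNA implements can be summarized compactly. The following is the standard textbook (van Kampen) form, written out here for orientation rather than quoted from the iNA paper:

      % van Kampen ansatz: molecule numbers n split into a macroscopic part
      % plus fluctuations, with system size \Omega:
      \[
        n = \Omega\,\phi(t) + \Omega^{1/2}\,\xi
      \]
      % To leading order \phi obeys the macroscopic rate equations, and the
      % fluctuations \xi are Gaussian (the Linear Noise Approximation) with
      % covariance \Sigma solving a Lyapunov equation:
      \[
        \frac{d\phi}{dt} = S\,f(\phi), \qquad
        \frac{d\Sigma}{dt} = J\,\Sigma + \Sigma\,J^{\mathsf{T}}
          + S\,\mathrm{diag}\!\left(f(\phi)\right)S^{\mathsf{T}}
      \]
      % with S the stoichiometric matrix, f(\phi) the macroscopic propensities,
      % and J = S\,\partial f/\partial\phi the Jacobian.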

  2. Methods and Apparatus for Autonomous Robotic Control

    NASA Technical Reports Server (NTRS)

    Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Versace, Massimiliano (Inventor); Palma, Jesse (Inventor)

    2017-01-01

    Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated, processing, with little interaction between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.

  3. Satellite Remote Sensing Tools at the Alaska Volcano Observatory

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Dean, K.; Webley, P.; Bailey, J.; Valcic, L.

    2008-12-01

    Volcanoes rarely conform to schedules or convenience. This is even more the case for remote volcanoes that still have impact on local infrastructure and air traffic. With well over 100 eruptions in the North Pacific over 20 years, the Alaska Volcano Observatory has developed a series of web-based tools to rapidly assess satellite imagery of volcanic eruptions from virtually anywhere. These range from automated alarm systems that detect thermal anomalies and ash plumes at volcanoes, to efficient image processing that can be done at a moment's notice from any computer linked to the internet. The thermal anomaly detection algorithm looks for warm pixels several standard deviations above the background, as well as pixels that show stronger mid-infrared (3-5 micron) signals relative to the available thermal channels (10-12 microns). The ash algorithm primarily uses the brightness temperature difference of two thermal bands, but also considers cloud shape and performs noise elimination. The automated algorithms are far from perfect, with 60-70% success rates, but improve with each eruption. All of the data are available to the community online in a variety of forms that provide rudimentary processing. The website, avo-animate.images.alaska.edu, is designed for use by AVO's partners and "customers" to provide quick synoptic views of volcanic activity. These tools have also been essential in AVO's efforts in recent years and provide a model for rapid response to eruptions at distant volcanoes anywhere in the world.
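
    The two detection tests described above are simple enough to sketch. The thresholds and background statistics below are placeholders chosen for illustration, not AVO's operational values:

      # Schematic of the two satellite tests described in the abstract; the
      # thresholds and window choices are placeholders, not AVO's settings.
      import numpy as np

      def thermal_anomaly_mask(bt_mir, bt_tir, n_sigma=3.0, dt_min=5.0):
          """Flag pixels hot relative to the scene background and warmer in the
          mid-IR (3-5 um) band than in the thermal-IR (10-12 um) band."""
          background, spread = np.median(bt_mir), np.std(bt_mir)
          hot = bt_mir > background + n_sigma * spread     # warm vs. background
          mir_excess = (bt_mir - bt_tir) > dt_min          # MIR stronger than TIR
          return hot & mir_excess

      def ash_mask(bt_11um, bt_12um, btd_max=-0.5):
          """Split-window test: silicate ash makes BT(11um) - BT(12um) negative."""
          return (bt_11um - bt_12um) < btd_max

      # Toy scene: 100x100 brightness temperatures (K) with one hot spot.
      rng = np.random.default_rng(1)
      bt_mir = rng.normal(265.0, 1.0, (100, 100))
      bt_tir = rng.normal(263.0, 1.0, (100, 100))
      bt_mir[50, 50] = 300.0                               # volcanic hot spot
      print("anomalies found:", np.argwhere(thermal_anomaly_mask(bt_mir, bt_tir)))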

  4. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  5. Space geodetic tools provide early warnings for earthquakes and volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Aoki, Yosuke

    2017-04-01

    The development of space geodetic techniques such as the Global Navigation Satellite System and Synthetic Aperture Radar in the last few decades allows us to monitor deformation of Earth's surface with unprecedented spatial and temporal resolution. These observations, combined with fast data transmission and quick data processing, enable us to quickly detect and locate earthquakes and volcanic eruptions and assess potential hazards such as strong earthquake shaking, tsunamis, and volcanic eruptions. These techniques thus are key parts of early warning systems, help identify some hazards before a cataclysmic event, and improve the response to the consequent damage.

  6. TSPmap, a tool making use of traveling salesperson problem solvers in the efficient and accurate construction of high-density genetic linkage maps.

    PubMed

    Monroe, J Grey; Allen, Zachariah A; Tanger, Paul; Mullen, Jack L; Lovell, John T; Moyers, Brook T; Whitley, Darrell; McKay, John K

    2017-01-01

    Recent advances in nucleic acid sequencing technologies have led to a dramatic increase in the number of markers available to generate genetic linkage maps. This increased marker density can be used to improve genome assemblies as well as add much needed resolution for loci controlling variation in ecologically and agriculturally important traits. However, traditional genetic map construction methods from these large marker datasets can be computationally prohibitive and highly error prone. We present TSPmap, a method which implements both approximate and exact Traveling Salesperson Problem solvers to generate linkage maps. We demonstrate that for datasets with large numbers of genomic markers (e.g. 10,000) and in multiple population types generated from inbred parents, TSPmap can rapidly produce high quality linkage maps with low sensitivity to missing and erroneous genotyping data compared to two other benchmark methods, JoinMap and MSTmap. TSPmap is open source and freely available as an R package. With the advancement of low cost sequencing technologies, the number of markers used in the generation of genetic maps is expected to continue to rise. TSPmap will be a useful tool to handle such large datasets into the future, quickly producing high quality maps using a large number of genomic markers.
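
    The core idea of casting marker ordering as a Traveling Salesperson Problem is easy to illustrate: treat pairwise recombination fractions as distances and search for the shortest open path through all markers. The toy Python sketch below (TSPmap itself is an R package with far better solvers) uses a nearest-neighbour tour refined by 2-opt:

      # Toy marker ordering as an open-path TSP over recombination fractions.
      import numpy as np

      def path_length(tour, D):
          return sum(D[tour[k], tour[k + 1]] for k in range(len(tour) - 1))

      def nearest_neighbor(D):
          tour, left = [0], set(range(1, len(D)))
          while left:
              nxt = min(left, key=lambda j: D[tour[-1], j])
              tour.append(nxt)
              left.remove(nxt)
          return tour

      def two_opt(tour, D):
          best, improved = path_length(tour, D), True
          while improved:
              improved = False
              for i in range(len(tour) - 1):
                  for j in range(i + 1, len(tour)):
                      cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                      if path_length(cand, D) < best - 1e-12:
                          tour, best, improved = cand, path_length(cand, D), True
          return tour

      # Simulate 15 markers; map distance -> recombination fraction with
      # Haldane's function rf = 0.5*(1 - exp(-2d)), d in Morgans.
      rng = np.random.default_rng(2)
      pos = np.sort(rng.uniform(0, 100, 15))               # true positions (cM)
      d = np.abs(pos[:, None] - pos[None, :]) / 100.0
      rf = 0.5 * (1.0 - np.exp(-2.0 * d))
      perm = rng.permutation(15)                           # markers arrive shuffled
      D = rf[np.ix_(perm, perm)]
      order = two_opt(nearest_neighbor(D), D)
      print(perm[order])   # ideally 0..14 in increasing (or decreasing) order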

  7. Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1990-01-01

    Further experiments were performed to improve the design and fabrication techniques of the integrated sole. The sole design is shown to be related to the foot position requirements and the actual shape of the foot, including the presence of neurotropic ulcers or other infections. Factors for consideration were: heel pitch, balance line, and rigidity conditions of the foot. Machining considerations were also part of the design problem. Among these considerations, the widths of each contour, tool motion, tool feed rate, depths of cut, and slopes of cut at the boundary were the key elements. The essential fabrication technique evolved around the idea of machining a mold and then, using quick-firm latex material, casting the sole through the mold. Two main mold materials were experimented with: plaster and wood. Plaster was very easy to machine and shape but could barely support the pressure in the hydraulic press required by the casting process. Wood was found to be quite effective in terms of relative cost, strength, and surface smoothness, except for the problem of cutting against the fibers, which could generate ragged surfaces. The programming efforts to convert the original dBase programs into C programs so that they could be executed on the SUN computer at North Carolina State University are discussed.

  8. The PROMIS physical function correlates with the QuickDASH in patients with upper extremity illness.

    PubMed

    Overbeek, Celeste L; Nota, Sjoerd P F T; Jayakumar, Prakash; Hageman, Michiel G; Ring, David

    2015-01-01

    To assess disability more efficiently with less burden on the patient, the National Institutes of Health has developed the Patient Reported Outcomes Measurement Information System (PROMIS) Physical Function, an instrument based on item response theory and using computer adaptive testing (CAT). Initially, upper and lower extremity disabilities were not separated, and we were curious whether the PROMIS Physical Function CAT could measure upper extremity disability as captured by the Quick Disabilities of the Arm, Shoulder and Hand (QuickDASH) questionnaire. We aimed to assess the correlation between the PROMIS Physical Function and QuickDASH questionnaires in patients with upper extremity illness. Secondarily, we addressed whether the PROMIS Physical Function and QuickDASH correlate with the PROMIS Depression CAT and PROMIS Pain Interference CAT instruments. Finally, we assessed factors associated with the QuickDASH and PROMIS Physical Function in multivariable analysis. A cohort of 93 outpatients with upper extremity illnesses completed the QuickDASH and three PROMIS CAT questionnaires: Physical Function, Pain Interference, and Depression. Pain intensity was measured with an 11-point ordinal measure (0-10 numeric rating scale). The correlation between the PROMIS Physical Function and the QuickDASH was assessed. Factors that correlated with the PROMIS Physical Function and QuickDASH were assessed in multivariable regression analysis after initial bivariate analysis. There was a moderate correlation between the PROMIS Physical Function and the QuickDASH questionnaire (r=-0.55, p<0.001). Greater disability as measured with the PROMIS and QuickDASH correlated most strongly with PROMIS Depression (r=-0.35, p<0.001 and r=0.34, p<0.001 respectively) and Pain Interference (r=-0.51, p<0.001 and r=0.74, p<0.001 respectively). The factors accounting for the variability in PROMIS scores are comparable to those for the QuickDASH, except that the PROMIS Physical Function is influenced by other pain conditions while the QuickDASH is not. The PROMIS Physical Function instrument may be used as an upper extremity disability measure, as it correlates with the QuickDASH questionnaire, and both instruments are influenced most strongly by the degree to which pain interferes with achieving goals. Level III, diagnostic study. See the Instructions for Authors for a complete description of levels of evidence.

  9. A screening tool to prioritize public health risk associated with accidental or deliberate release of chemicals into the atmosphere

    PubMed Central

    2013-01-01

    The Chemical Events Working Group of the Global Health Security Initiative has developed a flexible screening tool for chemicals that present a risk when accidentally or deliberately released into the atmosphere. The tool is generic, semi-quantitative, independent of site, situation and scenario, encompasses all chemical hazards (toxicity, flammability and reactivity), and can be easily and quickly implemented by non-subject matter experts using freely available, authoritative information. Public health practitioners and planners can use the screening tool to assist them in directing their activities in each of the five stages of the disaster management cycle. PMID:23517410

  10. Energy Sector Security through a System for Intelligent, Learning Network Configuration Monitoring and Management (“Essence”)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Craig; Larmouth, Robert

    The project was conceived and executed with the overarching objective of providing cost-effective tools to cooperative utilities that enable them to quickly detect, characterize, and take remedial action against cyber attacks.

  11. galstreams: Milky Way streams footprint library and toolkit

    NASA Astrophysics Data System (ADS)

    Mateu, Cecilia

    2017-11-01

    galstreams provides a compilation of spatial information for known stellar streams and overdensities in the Milky Way and includes Python tools for visualizing them. ASCII tables are also provided for quick viewing of the streams' footprints using TOPCAT (ascl:1101.010).

  12. 9th Annual Systems Engineering Conference: Volume 4 Thursday

    DTIC Science & Technology

    2006-10-26

    Connectivity, Speed, Volume
    • Enterprise application integration
    • Workflow integration or multi-media
    • Federated search capability
    • Link analysis and...categorization, federated search & automated discovery of information
    • Collaborative tools to quickly share relevant information
    Built on commercial...

  13. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    PubMed

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, the high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages, and performance bottlenecks. It is important to understand where the current tools do not perform well to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy; however, Minimap has lower memory usage and is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for a quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious and effective choices for each step of the genome assembly pipeline using nanopore sequence data. Also, with the help of the bottlenecks we have found, developers can improve the current tools or build new ones that are both accurate and fast, to overcome the high error rates of the nanopore sequencing technology.
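
    The overlap-layout-polish flow behind observations (2)-(4) can be strung together in a few commands. The sketch below drives it from Python; it assumes minimap2, miniasm and racon are installed, and the flags shown are typical documented usage rather than the authors' exact settings:

      # Sketch of an overlap -> layout -> polish nanopore pipeline. Assumes
      # minimap2, miniasm and racon are on PATH; verify flags against the docs.
      import subprocess

      reads = "reads.fastq"                      # hypothetical input reads

      def run(cmd, out_path):
          with open(out_path, "w") as out:
              subprocess.run(cmd, stdout=out, check=True)

      # 1) All-vs-all read overlaps in PAF format.
      run(["minimap2", "-x", "ava-ont", reads, reads], "overlaps.paf")

      # 2) Fast, uncorrected layout with miniasm (GFA output).
      run(["miniasm", "-f", reads, "overlaps.paf"], "assembly.gfa")

      # 3) Pull contig sequences out of the GFA "S" lines.
      with open("assembly.gfa") as gfa, open("contigs.fasta", "w") as fa:
          for line in gfa:
              if line.startswith("S"):
                  _, name, seq = line.rstrip("\n").split("\t")[:3]
                  fa.write(f">{name}\n{seq}\n")

      # 4) Map reads back to the contigs, then polish the consensus.
      run(["minimap2", "-x", "map-ont", "contigs.fasta", reads], "mapped.paf")
      run(["racon", reads, "mapped.paf", "contigs.fasta"], "polished.fasta")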

  14. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    PubMed

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

    This paper explores methods of building a regional internet of things for ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load-balancing technology, reliable storage of massive numbers of small files, and the implementation of a quick search function.

  15. Evolving and Controlling Perimeter, Rendezvous, and Foraging Behaviors in a Computation-Free Robot Swarm

    DTIC Science & Technology

    2016-04-01

    cheap, disposable swarms of robots that can accomplish these tasks quickly and without much human supervision. While there has been a lot of work...have shown that swarms of robots so dumb that they have no computational power (they can't even add or subtract, and have no memory) can still collec...behaviors can be achieved using swarms of computation-free robots. Our work starts with the simple robot model proposed in [6] and adds a form of

  16. The use of National Weather Service Data to Compute the Dose to the MEOI.

    PubMed

    Vickers, Linda

    2018-05-01

    The Turner method is the "benchmark method" for computing the stability class that is used to compute the X/Q (s/m³). The Turner method should be used to ascertain the validity of X/Q results determined by other methods. This paper used site-specific meteorological data obtained from the National Weather Service. The Turner method described herein is simple, quick, accurate, and transparent because all of the data, calculations, and results are visible for verification and validation against the published literature.
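
    Once the stability class is fixed, the ground-level centerline X/Q follows from the standard Gaussian plume result. The sketch below assumes a continuous ground release and uses the Briggs open-country dispersion fits for class D, quoted from memory; verify the coefficients against a dispersion reference before use:

      # Ground-level, centerline X/Q (s/m^3) for a continuous ground release:
      # X/Q = 1 / (pi * u * sigma_y * sigma_z). The sigma curves below are the
      # Briggs open-country fits for stability class D only -- an assumption
      # for illustration; check them against a dispersion reference.
      import math

      def sigmas_class_D(x_m):
          """Briggs rural dispersion coefficients (m) for Pasquill class D."""
          sigma_y = 0.08 * x_m / math.sqrt(1 + 0.0001 * x_m)
          sigma_z = 0.06 * x_m / math.sqrt(1 + 0.0015 * x_m)
          return sigma_y, sigma_z

      def x_over_q(x_m, wind_speed):
          sigma_y, sigma_z = sigmas_class_D(x_m)
          return 1.0 / (math.pi * wind_speed * sigma_y * sigma_z)

      # Example: receptor 1 km downwind, 4 m/s wind, stability class D.
      print(f"X/Q = {x_over_q(1000.0, 4.0):.2e} s/m^3")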

  17. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  18. Easing The Calculation Of Bolt-Circle Coordinates

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.

    1995-01-01

    The Bolt Circle Calculation (BOLT-CALC) computer program is used to reduce the significant time consumed in manually computing the trigonometry of the rectangular Cartesian coordinates of holes in a bolt circle, as shown on a blueprint or drawing. It eliminates the risk of computational errors, particularly in cases involving many holes or in cases in which coordinates are expressed to many significant digits. The program assists in many practical situations arising in machine shops. Written in BASIC, it has also been successfully compiled and implemented using Microsoft's QuickBasic v4.0.
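
    The trigonometry being automated is compact. A minimal equivalent in Python (the original program is BASIC, per the abstract):

      # Coordinates of n equally spaced holes on a bolt circle.
      import math

      def bolt_circle(n_holes, diameter, start_angle_deg=0.0, cx=0.0, cy=0.0):
          """Return (x, y) pairs for n holes on a circle of the given diameter,
          centered at (cx, cy), with the first hole at start_angle_deg."""
          r = diameter / 2.0
          coords = []
          for k in range(n_holes):
              theta = math.radians(start_angle_deg + 360.0 * k / n_holes)
              coords.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
          return coords

      # Example: six holes on a 4.500-in bolt circle, first hole at 15 degrees.
      for x, y in bolt_circle(6, 4.500, 15.0):
          print(f"X = {x:+.4f}  Y = {y:+.4f}")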

  19. A Novel Passive Robotic Tool Interface

    NASA Astrophysics Data System (ADS)

    Roberts, Paul

    2013-09-01

    The increased capability of space robotics has seen their uses expand from simple sample gathering and mechanical adjuncts to humans, to sophisticated multipurpose investigative and maintenance tools that substitute for humans in many external space tasks. As with all space missions, reducing mass and system complexity is critical. A key contributor to robotic system mass and complexity is the number of motors and actuators needed. MDA has developed a passive tool interface that, like a household power drill, permits a single tool actuator to be interfaced with many Tool Tips without requiring additional actuators to manage the changing and storage of these tools. MDA's Multifunction Tool interface permits a wide range of Tool Tips to be designed to a single interface that can be pre-qualified to torque and strength limits, such that additional Tool Tips can be added to a mission's "tool kit" simply and quickly.

  20. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market due to changing customer requirements and expectations. One key area of production management that must continuously evolve, by searching for new methods and tools for increasing the efficiency of manufacturing systems, is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of Virtual Enterprises (VE) and Virtual Manufacturing Networks (VMN), in which integrated infrastructures of flexible resources are created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, allowing multiobjective production flow planning problems to be solved using game theory, the theory of strategic situations. For defined production system and production order models, ways of solving the production route planning problem in a VMN are presented through computational examples for different variants of production flow. Possible decision strategies, together with an analysis of the calculation results, are shown.

  1. Measuring, managing and maximizing performance of mineral processing plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1995-12-31

    The implementation of continuous quality improvement is the confluence of Total Quality Management, People Empowerment, Performance Indicators, and Information Engineering. The supporting information technologies allow a mineral processor to narrow the gap between management business objectives and the process control level. One of the most important contributors is the user friendliness and flexibility of the personal computer in a client/server environment. This synergistic combination, when used for real-time performance monitoring, translates into production cost savings, improved communications, and enhanced decision support. Other savings come from reduced time to collect data and perform tedious calculations, the ability to act quickly on fresh new data, and the generation and validation of data to be used by others. This paper presents an integrated view of plant management. The selection of the proper tools for continuous quality improvement is described. The process of selecting critical performance monitoring indices for improved plant performance is discussed. The importance of well-balanced technological improvement, personnel empowerment, total quality management, and organizational assets is stressed.

  2. Advantages of High Tolerance Measurements in Fusion Environments Applying Photogrammetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T. Dodson, R. Ellis, C. Priniski, S. Raftopoulos, D. Stevens, M. Viola

    2009-02-04

    Photogrammetry, a state-of-the-art technique of metrology employing digital photographs as the vehicle for measurement, has been investigated in the fusion environment. Benefits of this high tolerance methodology include relatively easy deployment for multiple point measurements and deformation/distortion studies. Depending on the equipment used, photogrammetric systems can reach tolerances of 25 microns (0.001 in) to 100 microns (0.004 in) on a 3-meter object. During the fabrication and assembly of the National Compact Stellarator Experiment (NCSX), the primary measurement systems deployed were CAD coordinate-based computer metrology equipment and supporting algorithms, such as both interferometer-aided (IFM) and absolute distance measurement-based (ADM) laser trackers, as well as portable Coordinate Measurement Machine (CMM) arms. Photogrammetry was employed at NCSX as a quick and easy tool to monitor coil distortions incurred during welding operations of the machine assembly process and as a way to reduce assembly downtime for metrology processes.

  3. ANNA: A Convolutional Neural Network Code for Spectroscopic Analysis

    NASA Astrophysics Data System (ADS)

    Lee-Brown, Donald; Anthony-Twarog, Barbara J.; Twarog, Bruce A.

    2018-01-01

    We present ANNA, a Python-based convolutional neural network code for the automated analysis of stellar spectra. ANNA provides a flexible framework that allows atmospheric parameters such as temperature and metallicity to be determined with accuracies comparable to those of established but less efficient techniques. ANNA performs its parameterization extremely quickly; typically several thousand spectra can be analyzed in less than a second. Additionally, the code incorporates features which greatly speed up the training process necessary for the neural network to measure spectra accurately, resulting in a tool that can easily be run on a single desktop or laptop computer. Thus, ANNA is useful in an era when spectrographs increasingly have the capability to collect dozens to hundreds of spectra each night. This talk will cover the basic features included in ANNA and demonstrate its performance in two use cases: an open cluster abundance analysis involving several hundred spectra, and a metal-rich field star study. Applicability of the code to large survey datasets will also be discussed.
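
    For orientation, here is a minimal 1-D convolutional regressor of the general kind described, written in PyTorch: flux sampled on a fixed wavelength grid goes in, a few atmospheric parameters come out. This is an illustrative sketch, not ANNA's actual architecture or training setup:

      # Minimal 1-D CNN regressor for stellar spectra (illustrative only).
      import torch
      import torch.nn as nn

      class SpectrumCNN(nn.Module):
          def __init__(self, n_pixels=1024, n_params=2):   # e.g. Teff, [Fe/H]
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
                  nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
              )
              self.head = nn.Linear(16 * (n_pixels // 16), n_params)

          def forward(self, flux):                   # flux: (batch, n_pixels)
              z = self.features(flux.unsqueeze(1))   # add a channel dimension
              return self.head(z.flatten(1))

      # Throughput check: "thousands of spectra in under a second" is plausible
      # because inference is just a handful of convolutions.
      model = SpectrumCNN().eval()
      with torch.no_grad():
          params = model(torch.randn(4096, 1024))
      print(params.shape)   # torch.Size([4096, 2])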

  4. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    PubMed

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software which needs to parse large amounts of sequence data quickly and accurately. For end users, FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualifies it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
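
    FastaValidator itself is a Java library; the short Python routine below only illustrates the kind of checks such a validator performs (non-empty headers, a sequence present in each record, legal IUPAC characters) and is not the library's API:

      # Illustration of FASTA validation checks; not the FastaValidator API.
      import re

      VALID_CHARS = re.compile(r"^[ACGTUNRYSWKMBDHVacgtunryswkmbdhv*-]+$")

      def validate_fasta(path):
          errors, header, has_seq = [], None, False
          with open(path) as fh:
              for lineno, line in enumerate(fh, 1):
                  line = line.rstrip("\n")
                  if line.startswith(">"):
                      if header is not None and not has_seq:
                          errors.append(f"record '{header}' has no sequence")
                      header, has_seq = line[1:], False
                      if not header:
                          errors.append(f"line {lineno}: empty header")
                  elif line:
                      if header is None:
                          errors.append(f"line {lineno}: sequence before first header")
                      elif not VALID_CHARS.match(line):
                          errors.append(f"line {lineno}: illegal characters")
                      else:
                          has_seq = True
          if header is not None and not has_seq:
              errors.append(f"record '{header}' has no sequence")
          return errors

      print(validate_fasta("example.fasta"))   # hypothetical input file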

  5. Building an infrastructure at PICKSC for the educational use of kinetic software tools

    NASA Astrophysics Data System (ADS)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; Amorim, L. D.; An, W.; Dalichaouch, T. N.; Davidson, A.; Joglekar, A.; Li, F.; May, J.; Touati, M.; Xu, X. L.; Yu, P.

    2016-10-01

    One aim of the Particle-In-Cell and Kinetic Simulation Center (PICKSC) at UCLA is to coordinate a community development of educational software for undergraduate and graduate courses in plasma physics and computer science. The rich array of physical behaviors exhibited by plasmas can be difficult to grasp by students. If they are given the ability to quickly and easily explore plasma physics through kinetic simulations, and to make illustrative visualizations of plasma waves, particle motion in electromagnetic fields, instabilities, or other phenomena, then they can be equipped with first-hand experiences that inform and contextualize conventional texts and lectures. We are developing an infrastructure for any interested persons to take our kinetic codes, run them without any prerequisite knowledge, and explore desired scenarios. Furthermore, we are actively interested in any ideas or input from other plasma physicists. This poster aims to illustrate what we have developed and gather a community of interested users and developers. Supported by NSF under Grant ACI-1339893.
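
    As a flavor of what such educational kinetic codes expose, here is a minimal 1-D electrostatic particle-in-cell loop (two counter-streaming beams in normalized units with the plasma frequency equal to 1). It is a classroom-style sketch for orientation, not PICKSC's codebase:

      # Minimal 1-D electrostatic PIC: two cold counter-streaming beams.
      import numpy as np

      ng, npart, dt, steps = 64, 10000, 0.1, 300
      L = 2 * np.pi / 0.5                     # box holds one unstable mode (k = 0.5)
      dx = L / ng

      rng = np.random.default_rng(0)
      x = rng.uniform(0, L, npart)
      v = np.where(rng.random(npart) < 0.5, 1.0, -1.0)     # beams at +/- 1
      v += 0.01 * np.sin(0.5 * x)                          # small seed perturbation
      q = -L / npart                          # electron macro-charge (unit density)

      for _ in range(steps):
          # Charge deposition with linear (cloud-in-cell) weighting.
          g = x / dx
          i0 = np.floor(g).astype(int) % ng
          w = g - np.floor(g)
          rho = np.bincount(i0, (1 - w) * q, ng) + np.bincount((i0 + 1) % ng, w * q, ng)
          rho = rho / dx + 1.0                # add neutralizing ion background

          # Gauss's law dE/dx = rho solved spectrally: ik E_k = rho_k.
          k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
          rho_k = np.fft.fft(rho)
          E_k = np.zeros_like(rho_k)
          E_k[1:] = -1j * rho_k[1:] / k[1:]
          E = np.real(np.fft.ifft(E_k))

          # Gather the field at the particles and push (charge/mass = -1).
          Ep = (1 - w) * E[i0] + w * E[(i0 + 1) % ng]
          v -= Ep * dt
          x = (x + v * dt) % L

      print("field energy:", 0.5 * np.sum(E**2) * dx)   # grows as the mode develops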

  6. Making the EZ Choice

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Analytical Mechanics Associates, Inc. (AMA), of Hampton, Virginia, created the EZopt software application through Small Business Innovation Research (SBIR) funding from NASA's Langley Research Center. The new software is a user-friendly tool kit that provides quick and logical solutions to complex optimal control problems. In its most basic form, EZopt converts process data into math equations and then proceeds to utilize those equations to solve problems within control systems. EZopt successfully proved its advantage when applied to short-term mission planning and onboard flight computer implementation. The technology has also solved multiple real-life engineering problems faced in numerous commercial operations. For instance, mechanical engineers use EZopt to solve control problems with robots, while chemical plants implement the application to overcome situations with batch reactors and temperature control. In the emerging field of commercial aerospace, EZopt is able to optimize trajectories for launch vehicles and perform potential space station-keeping tasks. Furthermore, the software also helps control electromagnetic devices in the automotive industry.

  7. User engineering: A new look at system engineering

    NASA Technical Reports Server (NTRS)

    Mclaughlin, Larry L.

    1987-01-01

    User Engineering is a new System Engineering perspective responsible for defining and maintaining the user view of the system. Its elements are a process to guide the project and customer, a multidisciplinary team including hard and soft sciences, rapid prototyping tools to build user interfaces quickly and modify them frequently at low cost, and a prototyping center for involving users and designers in an iterative way. The main consideration is reducing the risk that the end user will not or cannot effectively use the system. The process begins with user analysis to produce cognitive and work style models, and task analysis to produce user work functions and scenarios. These become major drivers of the human computer interface design which is presented and reviewed as an interactive prototype by users. Feedback is rapid and productive, and user effectiveness can be measured and observed before the system is built and fielded. Requirements are derived via the prototype and baselined early to serve as an input to the architecture and software design.

  8. A three-dimensional algebraic grid generation scheme for gas turbine combustors with inclined slots

    NASA Technical Reports Server (NTRS)

    Yang, S. L.; Cline, M. C.; Chen, R.; Chang, Y. L.

    1993-01-01

    A 3D algebraic grid generation scheme is presented for generating the grid points inside gas turbine combustors with inclined slots. The scheme is based on the 2D transfinite interpolation method. Since the scheme is a 2D approach, it is very efficient and can easily be extended to gas turbine combustors with either dilution hole or slot configurations. To demonstrate the feasibility and the usefulness of the technique, a numerical study of the quick-quench/lean-combustion (QQ/LC) zones of a staged turbine combustor is given. Preliminary results illustrate some of the major features of the flow and temperature fields in the QQ/LC zones. The formation of co- and counter-rotating bulk flows and the shapes of the temperature fields can be observed clearly, and the resulting patterns are consistent with experimental observations typical of the confined slanted jet-in-crossflow. Numerical solutions show the method to be an efficient and reliable tool for generating computational grids for analyzing gas turbine combustors with slanted slots.
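
    The 2D transfinite interpolation at the heart of such schemes blends four boundary curves into interior grid points with the Coons formula. A generic sketch (not the authors' combustor-specific code):

      # Generic 2-D transfinite interpolation (Coons patch) grid generator.
      import numpy as np

      def tfi_2d(bottom, top, left, right):
          """bottom/top: (ni, 2) boundary curves; left/right: (nj, 2) curves
          sharing corner points with them. Returns an (ni, nj, 2) grid."""
          ni, nj = len(bottom), len(left)
          u = np.linspace(0, 1, ni)[:, None, None]   # (ni, 1, 1)
          v = np.linspace(0, 1, nj)[None, :, None]   # (1, nj, 1)
          return ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
                  + (1 - u) * left[None, :, :] + u * right[None, :, :]
                  - ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
                     + (1 - u) * v * top[0]        + u * v * top[-1]))

      # Example: annular sector (combustor-like), radii 1..2, angles 0..60 deg.
      t = np.linspace(0, np.pi / 3, 21)
      r = np.linspace(1.0, 2.0, 11)
      bottom = np.stack([np.cos(t), np.sin(t)], 1)          # inner arc (r = 1)
      top    = 2.0 * bottom                                  # outer arc (r = 2)
      left   = np.stack([r * np.cos(t[0]), r * np.sin(t[0])], 1)
      right  = np.stack([r * np.cos(t[-1]), r * np.sin(t[-1])], 1)
      grid = tfi_2d(bottom, top, left, right)
      print(grid.shape)   # (21, 11, 2) of x, y grid points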

  9. Aspect Ratio of Receiver Node Geometry based Indoor WLAN Propagation Model

    NASA Astrophysics Data System (ADS)

    Naik, Udaykumar; Bapat, Vishram N.

    2017-08-01

    This paper presents the validation of an indoor wireless local area network (WLAN) propagation model for varying rectangular receiver node geometry. The rectangular client node configuration is a standard node arrangement in the computer laboratories of academic institutes and research organizations. The model assists in placing network nodes for better signal coverage. The proposed model is backed by wide-ranging real-time received signal strength measurements at 2.4 GHz. The shadow fading component of signal propagation under a realistic indoor environment is modelled with a dependency on the varying aspect ratio of the client node geometry. The developed model is useful in predicting indoor path loss for IEEE 802.11b/g WLAN. The new model provides better performance in comparison to the well-known International Telecommunication Union and free-space propagation models. It is shown that the proposed model is simple, can be a useful tool for indoor WLAN node deployment planning, and offers a quick method for the best utilisation of office space.
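
    Indoor models of this kind typically extend the standard log-distance path-loss model with lognormal shadowing; the paper's contribution is making the shadow-fading term depend on the aspect ratio of the node geometry, which is not reproduced here. The baseline model, as a sketch with assumed parameter values:

      # Log-distance path loss with lognormal shadowing:
      #   PL(d) = PL(d0) + 10*n*log10(d/d0) + X_sigma,  X_sigma ~ N(0, sigma^2).
      # The exponent n, reference loss and sigma below are assumed values.
      import numpy as np

      def path_loss_db(d_m, n=3.0, pl_d0=40.0, d0=1.0, sigma_shadow=4.0, rng=None):
          rng = rng or np.random.default_rng()
          shadow = rng.normal(0.0, sigma_shadow, np.shape(d_m))
          return pl_d0 + 10.0 * n * np.log10(np.asarray(d_m) / d0) + shadow

      # Received power at 2.4 GHz for a 20 dBm transmitter, node 5 m away.
      tx_dbm = 20.0
      prx = tx_dbm - path_loss_db(5.0, rng=np.random.default_rng(3))
      print(f"Prx ~ {prx:.1f} dBm")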

  10. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem-solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in the laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool designed for this study (VideoTool) to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten-week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, or group behavior, and did not interact with gender.

  11. Smartphone tool to collect repeated 24 h dietary recall data in Nepal.

    PubMed

    Harris-Fry, Helen; Beard, B James; Harrisson, Tom; Paudel, Puskar; Shrestha, Niva; Jha, Sonali; Shrestha, Bhim P; Manandhar, Dharma S; Costello, Anthony; Saville, Naomi M

    2018-02-01

    To outline the development of a smartphone-based tool to collect thrice-repeated 24 h dietary recall data in rural Nepal, and to describe energy intakes, common errors and researchers' experiences using the tool. We designed a novel tool to collect multi-pass 24 h dietary recalls in rural Nepal by combining the use of a CommCare questionnaire on smartphones, a paper form, a QR (quick response)-coded list of foods and a photographic atlas of portion sizes. Twenty interviewers collected dietary data on three non-consecutive days per respondent, with three respondents per household. Intakes were converted into nutrients using databases on nutritional composition of foods, recipes and portion sizes. Dhanusha and Mahottari districts, Nepal. Pregnant women, their mothers-in-law and male household heads. Energy intakes assessed in 150 households; data corrections and our experiences reported from 805 households and 6765 individual recalls. Dietary intake estimates gave plausible values, with male household heads appearing to have higher energy intakes (median (25th-75th centile): 12 079 (9293-14 108) kJ/d) than female members (8979 (7234-11 042) kJ/d for pregnant women). Manual editing of data was required when interviewers mistook portions for food codes and for coding items not on the food list. Smartphones enabled quick monitoring of data and interviewer performance, but we initially faced technical challenges with CommCare forms crashing. With sufficient time dedicated to development and pre-testing, this novel smartphone-based tool provides a useful method to collect data. Future work is needed to further validate this tool and adapt it for other contexts.

  12. Enhanced quasi-static particle-in-cell simulation of electron cloud instabilities in circular accelerators

    NASA Astrophysics Data System (ADS)

    Feng, Bing

    Electron cloud instabilities have been observed in many circular accelerators around the world and have raised concerns for future accelerators and possible upgrades. In this thesis, electron cloud instabilities are studied with the quasi-static particle-in-cell (PIC) code QuickPIC. Modeling in three dimensions the long-timescale propagation of a beam through electron clouds in circular accelerators requires faster and more efficient simulation codes. Thousands of processors are easily available for parallel computations. However, it is not straightforward to increase the effective speed of the simulation by running the same problem size on an increasing number of processors, because there is a limit to the domain size in the decomposition of the two-dimensional part of the code. A pipelining algorithm applied to the fully parallelized particle-in-cell code QuickPIC is implemented to overcome this limit. The pipelining algorithm uses multiple groups of processors and optimizes the job allocation on the processors in parallel computing. With this novel algorithm, it is possible to use on the order of 10^2 processors, and to expand the scale and the speed of the simulation with QuickPIC by a similar factor. In addition to the efficiency improvement with the pipelining algorithm, the fidelity of QuickPIC is enhanced by adding two physics models, the beam space charge effect and the dispersion effect. Simulation of two specific circular machines is performed with the enhanced QuickPIC. First, the proposed upgrade to the Fermilab Main Injector is studied with an eye upon guiding the design of the upgrade and code validation. Moderate emittance growth is observed for the upgrade of increasing the bunch population by 5 times. But the simulation also shows that increasing the beam energy from 8 GeV to 20 GeV or above can effectively limit the emittance growth. Then the enhanced QuickPIC is used to simulate the electron cloud effect on the electron beam in the Cornell Energy Recovery Linac (ERL), due to the extremely small emittance and high peak currents anticipated in the machine. A tune shift is discovered from the simulation; however, emittance growth of the electron beam in the electron cloud is not observed for ERL parameters.

  13. Second-Generation Six-Limbed Experimental Robot

    NASA Technical Reports Server (NTRS)

    Kennedy, Brett; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew; Garrett, Michael; Magnone, Lee

    2004-01-01

    The figure shows the LEMUR II, the second generation of the Limbed Excursion Mechanical Utility Robot (LEMUR), which was described in "Six-Legged Experimental Robot" (NPO-20897), NASA Tech Briefs, Vol. 25, No. 12 (December 2001), page 58. The LEMUR II incorporates a number of improvements, including new features, that extend its capabilities beyond those of its predecessor, which is now denoted the LEMUR I. To recapitulate: the LEMUR I was a six-limbed robot for demonstrating robotic capabilities for assembly, maintenance, and inspection. The LEMUR I was designed to be capable of walking autonomously along a truss structure toward a mechanical assembly at a prescribed location and to perform other operations. The LEMUR I was equipped with stereoscopic video cameras and image-data-processing circuitry for navigation and mechanical operations. It was also equipped with a wireless modem, through which it could be commanded remotely. Upon arrival at a mechanical assembly, the LEMUR I would perform simple mechanical operations with one or both of its front limbs. It could also transmit images to a host computer. Each of the six limbs of the LEMUR I was operated independently. Each of the four rear limbs had three degrees of freedom (DOFs), while each of the front two limbs had four DOFs. The front two limbs were designed to hold, operate, and/or be integrated with tools. The LEMUR I included an onboard computer equipped with an assortment of digital control circuits, digital input/output circuits, analog-to-digital converters for input, and digital-to-analog (D/A) converters for output. Feedback from optical encoders in the limb actuators was utilized for closed-loop microcomputer control of the positions and velocities of the actuators. The LEMUR II incorporates the following improvements over the LEMUR I: a) The drive trains for the joints of the LEMUR II are more sophisticated, providing greater torque and accuracy. b) The six limbs are arranged symmetrically about a hexagonal body platform instead of in straight lines along the sides. This symmetrical arrangement is more conducive to omnidirectional movement in a plane. c) The number of degrees of freedom of each of the rear four limbs has been increased by one. Now, every limb has four degrees of freedom: three at the hip (or shoulder, depending on one's perspective) and one at the knee (or elbow, depending on one's perspective). d) Now every limb (instead of only the two front limbs) can perform operations. For this purpose, each limb is tipped with an improved quick-release mechanism for swapping of end-effector tools. e) New end-effector tools have been developed. These include an instrumented rotary driver that accepts all tool bits that have 0.125-in. (3.175-mm)-diameter shanks, a charge-coupled-device video camera, a super bright light-emitting diode for illuminating the work area of the robot, and a generic collet tool that can be quickly and inexpensively modified to accept any cylindrical object up to 0.5 in. (12.7 mm) in diameter. f) The stereoscopic cameras are mounted on a carriage that moves along a circular track, thereby providing for omnidirectional machine vision. g) The control software has been augmented with software that implements innovations reported in two prior NASA Tech Briefs articles: the HIPS algorithm ["Hybrid Image-Plane/Stereo Manipulation" (NPO-30492), Vol. 28, No. 7 (July 2004), page 55] and the CAMPOUT architecture ["An Architecture for Controlling Multiple Robots" (NPO-30345), Vol. 28, No. 10 (October 2004), page 65].

  14. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    NASA Astrophysics Data System (ADS)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

    The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats; the most widely used entry point is the Graphical User Interface (BratGui), a front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files - it is distributed with another ESA toolbox (GUT) as the visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to SARAL and soon Sentinel-3), quick data visualization/export, and simple computation on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimetry Tutorial, which contains a strong introduction to altimetry, showing its applications in different fields such as oceanography, the cryosphere, geodesy, and hydrology, among others. Also included are "data use cases", with step-by-step examples of how to use the toolbox in the different contexts. The upcoming release, currently on the development forge, will focus on the Sentinel-3 Surface Topography Mission, which builds on the successful heritage of ERS, Envisat and CryoSat. The first of the two Sentinel-3 satellites is expected to be launched in 2014. It will carry a dual-frequency (Ku- and C-band) advanced synthetic aperture radar altimeter and will provide measurements at an along-track resolution of ~300 m in SAR mode. Sentinel-3 will provide accurate measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The future version will provide, among other enhancements, support for reading the upcoming Sentinel-3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. The BRAT software is distributed under the GNU GPL open-source license and can be obtained, along with all the documentation (including the tutorial), from the website: http://earth.esa.int/brat
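
    Since BRAT's exports and the underlying products are netCDF, much of the "bypass the data-formatting hassle" workflow can be pictured in a few lines of Python. The sketch below is not part of BRAT; the file name and variable names are hypothetical placeholders for whatever a given export actually contains:

        # Reading a netCDF file such as one exported from BratGui, using the
        # standard netCDF4 package. File and variable names are hypothetical.
        from netCDF4 import Dataset

        with Dataset("brat_export.nc") as nc:
            print(list(nc.variables))              # discover what the file holds
            lat = nc.variables["latitude"][:]      # assumed variable names
            lon = nc.variables["longitude"][:]
            ssh = nc.variables["ssh"][:]

        print(f"{ssh.size} records, mean SSH = {ssh.mean():.3f} m")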

  15. QuickAssist: Reading and Learning Vocabulary Independently with the Help of CALL and NLP Technologies

    ERIC Educational Resources Information Center

    Wood, Peter

    2011-01-01

    Independent learning is a buzzword that is often used in connection with computer technologies applied to the area of foreign language instruction. This chapter takes a critical look at some of the stereotypes that exist with regard to computer-assisted language learning (CALL) as a money saver and an easy way to create an "independent"…

  16. Screening tool to evaluate the vulnerability of down-gradient receptors to groundwater contaminants from uncapped landfills

    USGS Publications Warehouse

    Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony R.; Romanok, Kristin M.; Wengrowski, Edward W

    2015-01-01

    A screening tool for quantifying the level of concern posed by contaminants detected in monitoring wells on or near landfills to down-gradient receptors (streams, wetlands, and residential lots) was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. The sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate, or low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for conventional numerically based transport models or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
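
    QDM itself is a spreadsheet implementation whose exact formulation is not reproduced in this record. For orientation, the sketch below implements the classical steady-state Domenico centerline solution on which such screening tools are commonly based (one of several published variants; this one assumes a source at the water table, and every parameter value in the example is hypothetical):

        import math

        def domenico_centerline(C0, x, v, ax, ay, az, Y, Z, lam=0.0):
            """Steady-state Domenico centerline concentration at distance x
            down-gradient of a constant-strength planar source of width Y and
            depth Z. v: seepage velocity; ax, ay, az: longitudinal, transverse,
            and vertical dispersivities; lam: first-order decay rate."""
            decay = math.exp((x / (2.0 * ax))
                             * (1.0 - math.sqrt(1.0 + 4.0 * lam * ax / v)))
            lateral = math.erf(Y / (4.0 * math.sqrt(ay * x)))
            vertical = math.erf(Z / (2.0 * math.sqrt(az * x)))  # water-table source
            return C0 * decay * lateral * vertical

        # Hypothetical plume reaching a stream 150 m down-gradient:
        print(domenico_centerline(C0=1.0, x=150.0, v=0.1, ax=15.0, ay=1.5,
                                  az=0.15, Y=30.0, Z=3.0, lam=0.001))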

  17. International Space Station Centrifuge Rotor Models: A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach

    NASA Technical Reports Server (NTRS)

    Nguyen, Louis H.; Ramakrishnan, Jayant; Granda, Jose J.

    2006-01-01

    The assembly and operation of the International Space Station (ISS) require extensive testing and engineering analysis to verify that the Space Station system of systems would work together without any adverse interactions. Since the dynamic behavior of an entire Space Station cannot be tested on earth, math models of the Space Station structures and mechanical systems have to be built and integrated in computer simulations and analysis tools to analyze and predict what will happen in space. The ISS Centrifuge Rotor (CR) is one of many mechanical systems that need to be modeled and analyzed to verify the ISS integrated system performance on-orbit. This study investigates using Bond Graph modeling techniques as quick and simplified ways to generate models of the ISS Centrifuge Rotor. This paper outlines the steps used to generate simple and more complex models of the CR using Bond Graph Computer Aided Modeling Program with Graphical Input (CAMP-G). Comparisons of the Bond Graph CR models with those derived from Euler-Lagrange equations in MATLAB and those developed using multibody dynamic simulation at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) are presented to demonstrate the usefulness of the Bond Graph modeling approach for aeronautics and space applications.
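
    For reference, the Euler-Lagrange route mentioned above derives the equations of motion from the system Lagrangian (generic notation, not taken from the paper): with L = T - V and generalized coordinates q_i,

        \frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i} = Q_i

    where T and V are the kinetic and potential energies and Q_i collects the nonconservative generalized forces. The bond graph approach instead assembles equivalent state equations from the power flows between lumped elements, which is what makes it attractive as a quick way to generate models in CAMP-G.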

  18. Parametric Study of a Mixer/Ejector Nozzle with Mixing Enhancement Devices

    NASA Technical Reports Server (NTRS)

    DalBello, T.; Steffen, C. J., Jr.

    2001-01-01

    A numerical study employing a simplified model of the High Speed Civil Transport mixer/ejector nozzle has been conducted to investigate the effect of tabs (vortex generators) on the mixing process. More complete mixing of the primary and secondary flows within the confined ejector lowers the peak exit velocity, resulting in reduced jet noise. Tabs were modeled as vortex pairs and inserted into the computational model. The location, size, and number of tabs were varied, and their effects on the mixing process are presented here both quantitatively and qualitatively. A baseline case (no tabs) along with six other cases involving two different vortex strengths at three different orientations have been computed and analyzed. The case with the highest vorticity (six vortices representing large tabs) gives the best mixing. It is shown that the influence of the vorticity acts primarily in the forward or middle portions of the duct, significantly alters the flow structure, and promotes some mixing in the lateral direction. Unmixed pockets were found at the top and bottom of the lobe, and more clever placement of tabs improved mixing in the vertical direction. The technique of replacing tabs with vortices shows promise as an efficient tool for quickly optimizing tab placement in lobed mixers.

  19. Personal digital assistant applications for the healthcare provider.

    PubMed

    Keplar, Kristine E; Urbanski, Christopher J

    2003-02-01

    To review some common medical applications available for personal digital assistants (PDAs), with brief discussion of the different PDA operating systems and memory requirements. Key search terms included handheld, PDA, personal digital assistants, and medical applications. The literature was accessed through MEDLINE (1999-August 2002). Other information was obtained through secondary sources such as Web sites describing common PDAs. Medical applications available on PDAs are numerous and include general drug references, specialized drug references (e.g., pediatrics, geriatrics, cardiology, infectious disease), diagnostic guides, medical calculators, herbal medication references, nursing references, toxicology references, and patient tracking databases. Costs and memory requirements for these programs can vary; consequently, the healthcare provider must limit the medical applications that are placed on the handheld computer. This article attempts to systematically describe the common medical applications available for the handheld computer along with cost, memory and download requirements, and Web site information. This review found many excellent PDA drug information applications offering many features that will aid the healthcare provider. Very likely, after using these PDA applications, the healthcare provider will find them indispensable, as their multifunctional capabilities can save time, improve accuracy, and allow for general business procedures as well as serving as a quick reference tool. To forgo the benefits of this technology might be a step backward.

  20. Transient Simulation of Speed-No Load Conditions With An Open-Source Based C++ Code

    NASA Astrophysics Data System (ADS)

    Casartelli, E.; Mangani, L.; Romanelli, G.; Staubli, T.

    2014-03-01

    Modern reversible pump-turbines can start in turbine operation very quickly, i.e. within a few minutes. Unfortunately, no clear design rules for runners with a stable start-up are available, so that certain machines can present unstable characteristics which lead to oscillations in the hydraulic system during synchronization. The so-called S-shape, i.e. the unstable characteristic in turbine brake operation, is defined by the change of sign of the slope of the head curve. In order to assess and understand this kind of instability with CFD, fast and reliable methods are needed. Using a 360-degree model including the complete machine from spiral casing to draft tube, the capabilities of a newly developed in-house tool are presented. An ad-hoc simulation is performed from no-load conditions into the S-shape in transient mode, using moving-mesh capabilities and thus capturing the opening process of the wicket gates, as during start-up. Besides the presentation of the computational methodology, various phenomena encountered are analyzed and discussed, comparing them with measured and previously computed data, in order to show the capabilities of the developed procedure. Insight into the detected phenomena is also given for global data, such as frequencies of vortical structures, and for local flow patterns.

  1. CFD Modelling of a Quadrupole Vortex Inside a Cylindrical Channel for Research into Advanced Hybrid Rocket Designs

    NASA Astrophysics Data System (ADS)

    Godfrey, B.; Majdalani, J.

    2014-11-01

    This study relies on computational fluid dynamics (CFD) tools to analyse a possible method for creating a stable quadrupole vortex within a simulated, circular-port, cylindrical rocket chamber. A model of the vortex generator is created in the SolidWorks CAD program and the grid is then generated using the Pointwise mesh generation software. The non-reactive flowfield is simulated using an open-source computational program, Stanford University Unstructured (SU2). Subsequent analysis and visualization are performed using ParaView. The vortex generation approach that we employ consists of four tangentially injected monopole vortex generators that are arranged symmetrically with respect to the center of the chamber in such a way as to produce a quadrupole vortex with a common downwash. The present investigation focuses on characterizing the flow dynamics so that future investigations can be undertaken with increasing levels of complexity. Our CFD simulations help to elucidate the onset of vortex filaments within the monopole tubes, and the evolution of quadrupole vortices downstream of the injection faceplate. Our results indicate that the quadrupole vortices produced using the present injection pattern can become quickly unstable, to the extent of dissipating soon after being introduced into the simulated rocket chamber. We conclude that a change in the geometrical configuration will be necessary to produce more stable quadrupoles.
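
    The CFD model itself is not reproducible from the abstract, but the quadrupole arrangement can be pictured with an idealized 2D point-vortex analogue: four vortices of alternating sign placed symmetrically about the chamber axis. The sketch below is illustrative only; positions, strengths, and evaluation points are hypothetical:

        import numpy as np

        # Idealized 2D point-vortex analogue (not the SU2 CFD model): four
        # vortices of alternating sign arranged symmetrically about the
        # chamber axis, as in the quadrupole arrangement described above.
        gamma = np.array([+1.0, -1.0, +1.0, -1.0])        # circulations
        angles = np.deg2rad([45.0, 135.0, 225.0, 315.0])
        zv = 0.5 * np.exp(1j * angles)                    # vortex centers

        def induced_velocity(z):
            """Complex conjugate velocity u - i*v induced at point(s) z."""
            w = np.zeros_like(z, dtype=complex)
            for g, z0 in zip(gamma, zv):
                w += g / (2j * np.pi * (z - z0))
            return w

        pts = np.array([0.0 + 0.0j, 0.25 + 0.0j])
        print(induced_velocity(pts))   # z = 0 is a stagnation point by symmetry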

  2. Tetrahedral Hohlraum Visualization and Pointings

    NASA Astrophysics Data System (ADS)

    Klare, K. A.; Wallace, J. M.; Drake, D.

    1997-11-01

    In designing experiments for Omega, the tetrahedral hohlraum (a sphere with four holes) can make full use of all 60 beams. There are some complications: the beams must clear the laser entrance hole (LEH), must miss a central capsule, absolutely must not go out the other LEHs, and should distribute in the interior of the hohlraum to maximize the uniformity of irradiation on the capsule while keeping reasonable laser spot sizes. We created a 15-offset coordinate system with which an IDL program computes clearances, writes a file for QuickDraw 3D (QD3D) visualization, and writes input for the viewfactor code RAYNA IV. Visualizing and adjusting the parameters by eye gave more reliable results than computer optimization. QD3D images permitted quick live rotations to determine offsets. The clearances obtained ensured safe operation and good physics. The viewfactor code computes the initial irradiation of the hohlraum and capsule or of a uniform hohlraum source with the loss through the four LEHs and shows a high degree of uniformity with both, better for lasers because this deposits more energy near the LEHs to compensate for the holes.
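
    The clearance computations were done in an IDL program that is not reproduced here. As a hedged illustration of the kind of geometric test involved, the sketch below checks whether a beam's central ray misses the central capsule, using the perpendicular distance from a ray to a sphere; all numbers are hypothetical:

        import numpy as np

        def ray_clearance(p, d, center, radius):
            """Miss distance of the ray p + t*d (t >= 0) from a sphere,
            minus the sphere radius; positive means the ray clears it."""
            d = d / np.linalg.norm(d)
            v = center - p
            t = max(np.dot(v, d), 0.0)        # closest approach along the ray
            closest = p + t * d
            return np.linalg.norm(center - closest) - radius

        # Hypothetical beam entering an LEH toward the hohlraum interior:
        p = np.array([0.0, 0.0, 2.4])         # launch point (mm)
        d = np.array([0.3, 0.1, -1.0])        # beam direction
        print(ray_clearance(p, d, np.zeros(3), radius=0.25))  # central capsule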

  3. Bio-inspired approach to multistage image processing

    NASA Astrophysics Data System (ADS)

    Timchenko, Leonid I.; Pavlov, Sergii V.; Kokryatskaya, Natalia I.; Poplavska, Anna A.; Kobylyanska, Iryna M.; Burdenyuk, Iryna I.; Wójcik, Waldemar; Uvaysova, Svetlana; Orazbekov, Zhassulan; Kashaganova, Gulzhan

    2017-08-01

    Multistage integration of visual information in the brain allows people to respond quickly to the most significant stimuli while preserving the ability to recognize small details in the image. Implementation of this principle in technical systems can lead to more efficient processing procedures. The multistage approach to image processing described in this paper comprises the main types of cortical multistage convergence. One of these types occurs within each visual pathway and the other between the pathways. This approach maps input images into a flexible hierarchy which reflects the complexity of the image data. The procedures of temporal image decomposition and hierarchy formation are described in mathematical terms. The multistage system highlights spatial regularities, which are passed through a number of transformational levels to generate a coded representation of the image that encapsulates, in a compact manner, structure at different hierarchical levels of the image. At each processing stage a single output result is computed to allow a very quick response from the system. The result is represented as an activity pattern, which can be compared with previously computed patterns on the basis of the closest match.
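
    The paper describes its decomposition only in general terms; as a loose illustration of coarse-to-fine multistage processing, a conventional image pyramid computes one cheap summary per level while finer levels retain detail. The sketch below is a generic analogue, not the authors' method:

        import numpy as np

        def pyramid(image, levels=4):
            """Crude multi-scale pyramid: average each 2x2 block to smooth
            and decimate by 2 at every level. Generic coarse-to-fine
            illustration only; the paper's decomposition differs in detail."""
            out = [image]
            for _ in range(levels - 1):
                img = out[-1]
                h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
                img = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
                out.append(img)
            return out

        levels = pyramid(np.random.rand(64, 64))
        print([l.shape for l in levels])  # (64,64) -> (32,32) -> (16,16) -> (8,8)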

  4. SU-F-BRB-07: A Plan Comparison Tool to Ensure Robustness and Deliverability in Online-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Labby, Z; Bayliss, R A

    Purpose: To develop a plan comparison tool that will ensure robustness and deliverability through analysis of baseline and online-adaptive radiotherapy plans using similarity metrics. Methods: The ViewRay MRIdian treatment planning system allows export of a plan file that contains plan and delivery information. A software tool was developed to read and compare two plans, providing information and metrics to assess their similarity. In addition to performing direct comparisons (e.g. demographics, ROI volumes, number of segments, total beam-on time), the tool computes and presents histograms of derived metrics (e.g. step-and-shoot segment field sizes, segment average leaf gaps). Such metrics were investigated for their ability to predict whether an online-adapted plan is reasonably similar to a baseline plan for which deliverability has already been established. Results: In the realm of online-adaptive planning, comparing ROI volumes offers a sanity check to verify observations found during contouring. Beyond ROI analysis, it has been found that simply editing contours and re-optimizing to adapt treatment can produce a delivery that is substantially different from the baseline plan (e.g. number of segments increased by 31%), with no changes in optimization parameters and only minor changes in anatomy. Currently the tool can quickly identify large omissions or deviations from baseline expectations. As our online-adaptive patient population increases, we will continue to develop and refine quantitative acceptance criteria for adapted plans and relate them to historical delivery QA measurements. Conclusion: The plan comparison tool is in clinical use and reports a wide range of comparison metrics, illustrating key differences between two plans. This independent check is accomplished in seconds and can be performed in parallel to other tasks in the online-adaptive workflow. Current use prevents large planning or delivery errors from occurring, and ongoing refinements will lead to increased assurance of plan quality.
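
    The abstract does not give the similarity metrics in closed form. As a hedged illustration of comparing derived-metric histograms between a baseline and an adapted plan, one crude possibility is a histogram-overlap score; all data below are invented:

        import numpy as np

        def histogram_overlap(a, b, bins=20):
            """Fraction of overlap between two metric distributions
            (1 = identical). A crude stand-in for the tool's metrics."""
            lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
            ha, _ = np.histogram(a, bins=bins, range=(lo, hi))
            hb, _ = np.histogram(b, bins=bins, range=(lo, hi))
            return np.minimum(ha / ha.sum(), hb / hb.sum()).sum()

        # Hypothetical segment field sizes (cm^2), baseline vs adapted plan:
        baseline = np.random.normal(25.0, 5.0, 40)
        adapted = np.random.normal(28.0, 7.0, 52)   # more, larger segments
        print(f"field-size overlap: {histogram_overlap(baseline, adapted):.2f}")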

  5. The VLab repository of thermodynamics and thermoelastic properties of minerals

    NASA Astrophysics Data System (ADS)

    Da Silveira, P. R.; Sarkar, K.; Wentzcovitch, R. M.; Shukla, G.; Lindemann, W.; Wu, Z.

    2015-12-01

    Thermodynamics and thermoelastic properties of minerals at planetary interior conditions are essential as input for geodynamics simulations and for the interpretation of seismic tomography models. Precise experimental determination of these properties at such extreme conditions is very challenging. Therefore, ab initio calculations play an essential role in this context, but at the cost of great computational effort and memory use. Setting up a widely accessible and versatile mineral physics database can reduce unnecessary repetition of such computationally intensive calculations. Access to such data facilitates interaction across fields and can more quickly advance insights into deep Earth processes. Hosted by the Minnesota Supercomputing Institute, the Virtual Laboratory for Earth and Planetary Materials (VLab) was designed to develop and promote the theory of planetary materials using distributed, high-throughput quantum calculations. VLab hosts an interactive database of thermodynamics and thermoelastic properties of minerals computed ab initio. Such properties can be obtained according to the user's preference. The database is accompanied by interactive visualization tools, allowing users to repeat and build upon previously published results. Using VLab2015, we have evaluated thermoelastic properties such as elastic coefficients (Cij); Voigt, Reuss, and Voigt-Reuss-Hill aggregate averages for the bulk (K) and shear (G) moduli; shear wave velocity (VS); longitudinal wave velocity (Vp); and bulk sound velocity (VΦ) for several important minerals. The developed web services are general and can be used for crystals of any symmetry. Results can be tabulated, plotted, or downloaded from the VLab website according to the user's preference.
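
    The Voigt, Reuss, and Voigt-Reuss-Hill averages mentioned above follow from the 6x6 elastic stiffness matrix by standard textbook formulas valid for any crystal symmetry. The sketch below is an independent illustration, not VLab code; the example Cij values are hypothetical (roughly MgO-like):

        import numpy as np

        def vrh(C, rho):
            """Voigt, Reuss and Hill averages of bulk (K) and shear (G)
            moduli from a 6x6 stiffness matrix C (GPa), plus velocities in
            km/s for density rho (g/cm^3). Standard formulas, any symmetry."""
            S = np.linalg.inv(C)  # compliance matrix
            Kv = (C[0,0] + C[1,1] + C[2,2] + 2*(C[0,1] + C[0,2] + C[1,2])) / 9.0
            Gv = (C[0,0] + C[1,1] + C[2,2] - (C[0,1] + C[0,2] + C[1,2])
                  + 3*(C[3,3] + C[4,4] + C[5,5])) / 15.0
            Kr = 1.0 / (S[0,0] + S[1,1] + S[2,2] + 2*(S[0,1] + S[0,2] + S[1,2]))
            Gr = 15.0 / (4*(S[0,0] + S[1,1] + S[2,2])
                         - 4*(S[0,1] + S[0,2] + S[1,2])
                         + 3*(S[3,3] + S[4,4] + S[5,5]))
            K, G = (Kv + Kr) / 2.0, (Gv + Gr) / 2.0   # Hill averages
            f = 1.0 / rho     # GPa over g/cm^3 gives (km/s)^2 directly
            return K, G, np.sqrt((K + 4*G/3) * f), np.sqrt(G * f), np.sqrt(K * f)

        # Hypothetical cubic Cij (GPa), roughly MgO-like, density 3.58 g/cm^3:
        C = np.zeros((6, 6))
        C[:3, :3] = 95.0
        np.fill_diagonal(C[:3, :3], 297.0)
        C[3, 3] = C[4, 4] = C[5, 5] = 155.0
        print(vrh(C, rho=3.58))    # K, G, Vp, Vs, Vphi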

  6. Rapid Assessment for Conservation Education (RACE).

    ERIC Educational Resources Information Center

    Jacobson, Susan K.

    1997-01-01

    The Rapid Assessment for Conservation Education (RACE) provides guidelines and tools for identifying conservation education needs and recommending subsequent actions. Interviews, surveys, workshops, and public meetings are among the methods of qualitative and quantitative data collection used to gather information quickly and inexpensively.…

  7. Centralized Accounting and Electronic Filing Provides Efficient Receivables Collection.

    ERIC Educational Resources Information Center

    School Business Affairs, 1983

    1983-01-01

    An electronic filing system makes financial control manageable at Bowling Green State University, Ohio. The system enables quick access to computer-stored consolidated account data and microfilm images of charges, statements, and other billing documents. (MLF)

  8. Extending NASA Research Results to Benefit Society: Rapid Prototyping for Coastal Applications

    NASA Technical Reports Server (NTRS)

    Glorioso, Mark V.; Miller, Richard L.; Hall, Callie M.; McPherson, Terry R.

    2006-01-01

    The mission of the NASA Applied Sciences Program is to expand and accelerate the use of NASA research results to benefit society in 12 application areas of national priority. One of the program's major challenges is to perform a quick, efficient, and detailed review (i.e., prototyping) of the large number of combinations of NASA observations and results from Earth system models that may be used by a wide range of decision support tools. A Rapid Prototyping Capacity (RPC) is being developed to accelerate the use of NASA research results. Here, we present the conceptual framework of the Rapid Prototyping Capacity within the context of quickly assessing the efficacy of NASA research results and technologies to support the Coastal Management application. An initial RPC project designed to quickly evaluate the utility of moderate-resolution MODIS products for calibrating/validating coastal sediment transport models is also presented.

  9. Development of an Immunochromatography Assay (QuickNavi-Ebola) to Detect Multiple Species of Ebolaviruses

    PubMed Central

    Yoshida, Reiko; Muramatsu, Shino; Akita, Hiroshi; Saito, Yuji; Kuwahara, Miwa; Kato, Daisuke; Changula, Katendi; Miyamoto, Hiroko; Kajihara, Masahiro; Manzoor, Rashid; Furuyama, Wakako; Marzi, Andrea; Feldmann, Heinz; Mweene, Aaron; Masumu, Justin; Kapeteshi, Jimmy; Muyembe-Tamfum, Jean-Jacques; Takada, Ayato

    2016-01-01

    The latest outbreak of Ebola virus disease (EVD) in West Africa has highlighted the urgent need for the development of rapid and reliable diagnostic assays. We used monoclonal antibodies specific to the ebolavirus nucleoprotein to develop an immunochromatography (IC) assay (QuickNavi-Ebola) for rapid diagnosis of EVD. The IC assay was first evaluated with tissue culture supernatants of infected Vero E6 cells and found to be capable of detecting 10³–10⁴ focus-forming units/mL of ebolaviruses. Using serum samples from experimentally infected nonhuman primates, we confirmed that the assay could detect the viral antigen shortly after disease onset. It was also noted that multiple species of ebolaviruses could be detected by the IC assay. Owing to the simplicity of the assay procedure and absence of requirements for special equipment and training, QuickNavi-Ebola is expected to be a useful tool for rapid diagnosis of EVD. PMID:27462094

  10. HEALTH GeoJunction: place-time-concept browsing of health publications.

    PubMed

    MacEachren, Alan M; Stryker, Michael S; Turton, Ian J; Pezanowski, Scott

    2010-05-18

    The volume of health science publications is escalating rapidly. Thus, keeping up with developments is becoming harder, as is the task of finding important cross-domain connections. When geographic location is a relevant component of research reported in publications, these tasks are more difficult because standard search and indexing facilities have limited or no ability to identify geographic foci in documents. This paper introduces HEALTH GeoJunction, a web application that supports researchers in the task of quickly finding scientific publications that are relevant geographically and temporally as well as thematically. HEALTH GeoJunction is a geovisual analytics-enabled web application providing: (a) web services using computational reasoning methods to extract place-time-concept information from bibliographic data for documents and (b) visually-enabled place-time-concept query, filtering, and contextualizing tools that apply to both the documents and their extracted content. This paper focuses specifically on strategies for visually-enabled, iterative, facet-like, place-time-concept filtering that allow analysts to quickly drill down to scientific findings of interest in PubMed abstracts and to explore relations among abstracts and extracted concepts in place and time. The approach enables analysts to: find publications without knowing all relevant query parameters, recognize unanticipated geographic relations within and among documents in multiple health domains, identify the thematic emphasis of research targeting particular places, notice changes in concepts over time, and notice changes in places where concepts are emphasized. PubMed is a database of over 19 million biomedical abstracts and citations maintained by the National Center for Biotechnology Information; achieving quick filtering is an important contribution due to the database size. Including geography in filters is important due to rapidly escalating attention to geographic factors in public health. The implementation of mechanisms for iterative place-time-concept filtering makes it possible to narrow searches efficiently and quickly from thousands of documents to a small subset that meets place-time-concept constraints. Support for a more-like-this query creates the potential to identify unexpected connections across diverse areas of research. Multi-view visualization methods support understanding of the place, time, and concept components of document collections and enable comparison of filtered query results to the full set of publications.

  11. PFAAT version 2.0: a tool for editing, annotating, and analyzing multiple sequence alignments.

    PubMed

    Caffrey, Daniel R; Dana, Paul H; Mathur, Vidhya; Ocano, Marco; Hong, Eun-Jong; Wang, Yaoyu E; Somaroo, Shyamal; Caffrey, Brian E; Potluri, Shobha; Huang, Enoch S

    2007-10-11

    By virtue of their shared ancestry, homologous sequences are similar in their structure and function. Consequently, multiple sequence alignments are routinely used to identify trends that relate to function. This type of analysis is particularly productive when it is combined with structural and phylogenetic analysis. Here we describe the release of PFAAT version 2.0, a tool for editing, analyzing, and annotating multiple sequence alignments. Support for multiple annotations is a key component of this release, as it provides a framework for most of the new functionalities. The sequence annotations are accessible from the alignment and tree, where they are typically used to label sequences or hyperlink them to related databases. Sequence annotations can be created manually or extracted automatically from UniProt entries. Once a multiple sequence alignment is populated with sequence annotations, sequences can be easily selected and sorted through a sophisticated search dialog. The selected sequences can be further analyzed using statistical methods that explicitly model relationships between the sequence annotations and residue properties. Residue annotations are accessible from the alignment viewer and are typically used to designate binding sites or properties for a particular residue. Residue annotations are also searchable, and allow one to quickly select alignment columns for further sequence analysis, e.g. computing percent identities. Other features include: novel algorithms to compute sequence conservation, mapping conservation scores to a 3D structure in Jmol, displaying secondary structure elements, and sorting sequences by residue composition. PFAAT provides a framework whereby end-users can specify knowledge for a protein family in the form of annotations. The annotations can be combined with sophisticated analysis to test hypotheses that relate to sequence, structure and function.
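
    One of the column-level computations mentioned above, percent identity, is simple enough to state exactly. Definitions vary (for example, in how gaps are handled); the sketch below uses one common convention, not necessarily PFAAT's:

        def percent_identity(seq_a, seq_b):
            """Percent identity between two rows of an alignment (equal
            length, '-' for gaps). Columns gapped in both rows are ignored;
            other conventions normalize differently."""
            assert len(seq_a) == len(seq_b)
            pairs = [(a, b) for a, b in zip(seq_a, seq_b)
                     if not (a == "-" and b == "-")]
            matches = sum(a == b and a != "-" for a, b in pairs)
            return 100.0 * matches / len(pairs)

        print(percent_identity("MK-LVILG", "MKALVL-G"))  # hypothetical rows: 62.5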

  12. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horsey, Henry; Fleming, Katherine; Ball, Brian

    Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path for reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option to accurately estimate building energy consumption in real time is to pre-compute large datasets of potential building energy models, and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating the successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but also can be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.
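
    The abstract does not spell out the framework's internals, but the metamodeling idea itself is easy to sketch: fit a fast statistical model to a precomputed batch of simulation inputs and outputs, then answer later queries from the fitted model in milliseconds. The sketch below is a hedged illustration with invented feature names and a faked stand-in for the simulation results; it uses scikit-learn:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        # Stand-in for a precomputed library of building-energy simulations:
        # rows = simulation runs, columns = hypothetical model inputs
        # (floor area m^2, window U-value, window-to-wall ratio).
        rng = np.random.default_rng(0)
        X = rng.uniform([500, 1.0, 0.1], [5000, 4.0, 0.6], size=(2000, 3))
        y = (80 * X[:, 0] + 9000 * X[:, 1] + 20000 * X[:, 2]
             + rng.normal(0, 5000, 2000))   # faked "annual energy use"

        meta = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

        # A new building is now scored instantly instead of via a full run:
        print(meta.predict([[2500, 2.2, 0.35]]))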

  13. Memory Benchmarks for SMP-Based High Performance Parallel Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, A B; de Supinski, B; Mueller, F

    2001-11-20

    As the speed gap between CPU and main memory continues to grow, memory accesses increasingly dominate the performance of many applications. The problem is particularly acute for symmetric multiprocessor (SMP) systems, where the shared memory may be accessed concurrently by a group of threads running on separate CPUs. Unfortunately, several key issues governing memory system performance in current systems are not well understood. Complex interactions between the levels of the memory hierarchy, buses or switches, DRAM back-ends, system software, and application access patterns can make it difficult to pinpoint bottlenecks and determine appropriate optimizations, and the situation is even more complex for SMP systems. To partially address this problem, we formulated a set of multi-threaded microbenchmarks for characterizing and measuring the performance of the underlying memory system in SMP-based high-performance computers. We report our use of these microbenchmarks on two important SMP-based machines. This paper has four primary contributions. First, we introduce a microbenchmark suite to systematically assess and compare the performance of different levels in SMP memory hierarchies. Second, we present a new tool based on hardware performance monitors to determine a wide array of memory system characteristics, such as cache sizes, quickly and easily; by using this tool, memory performance studies can be targeted to the full spectrum of performance regimes with many fewer data points than is otherwise required. Third, we present experimental results indicating that the performance of applications with large memory footprints remains largely constrained by memory. Fourth, we demonstrate that thread-level parallelism further degrades memory performance, even for the latest SMPs with hardware prefetching and switch-based memory interconnects.

  14. [Clinical skills and outcomes of chair-side computer aided design and computer aided manufacture system].

    PubMed

    Yu, Q

    2018-04-09

    Computer-aided design and computer-aided manufacture (CAD/CAM) technology is a kind of oral digital system applied to clinical diagnosis and treatment. It overturns the traditional pattern and provides a solution for restoring defective teeth quickly and efficiently. In this paper we mainly discuss the clinical skills of chair-side CAD/CAM systems, including tooth preparation, digital impression, three-dimensional design of the prosthesis, numerical control machining, clinical bonding and so on, and review the outcomes of several common kinds of materials at the same time.

  15. Explorative visual analytics on interval-based genomic data and their metadata.

    PubMed

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the spread of public repositories of processed NGS data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians make sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. GeMSE's effective application and practical usefulness are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/ , and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.

  16. CrossQuery: a web tool for easy associative querying of transcriptome data.

    PubMed

    Wagner, Toni U; Fischer, Andreas; Thoma, Eva C; Schartl, Manfred

    2011-01-01

    Enormous amounts of data are being generated by modern methods such as transcriptome or exome sequencing and microarray profiling. Primary analyses such as quality control, normalization, statistics and mapping are highly complex and need to be performed by specialists. Thereafter, results are handed back to biomedical researchers, who are then confronted with complicated data lists. For rather simple tasks like data filtering, sorting and cross-association there is a need for new tools which can be used by non-specialists. Here, we describe CrossQuery, a web tool that enables straightforward, simple-syntax queries to be executed on transcriptome sequencing and microarray datasets. We provide deep-sequencing data sets of stem cell lines derived from the model fish Medaka and microarray data of human endothelial cells. In the example datasets provided, mRNA expression levels, gene, transcript and sample identification numbers, GO-terms and gene descriptions can be freely correlated, filtered and sorted. Queries can be saved for later reuse and results can be exported to standard formats that allow copy-and-paste to all widespread data visualization tools such as Microsoft Excel. CrossQuery enables researchers to quickly and freely work with transcriptome and microarray data sets requiring only minimal computer skills. Furthermore, CrossQuery allows growing association of multiple datasets as long as at least one common point of correlated information, such as transcript identification numbers or GO-terms, is shared between samples. For advanced users, the object-oriented plug-in and event-driven code design of both server-side and client-side scripts allows easy addition of new features, data sources and data types.
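
    The filtering, sorting, and cross-association that CrossQuery exposes through its query syntax can be pictured with a few dataframe operations. The sketch below is an illustration only, with hypothetical tables and column names, not CrossQuery's actual implementation:

        import pandas as pd

        # Hypothetical stand-ins for two datasets sharing transcript IDs:
        expr = pd.DataFrame({"transcript_id": ["t1", "t2", "t3"],
                             "mrna_level": [12.4, 0.8, 55.1]})
        anno = pd.DataFrame({"transcript_id": ["t1", "t2", "t3"],
                             "go_term": ["GO:0006915", "GO:0008283", "GO:0006915"],
                             "description": ["apoptosis", "proliferation",
                                             "apoptosis"]})

        # Cross-associate on the shared key, then filter and sort:
        merged = expr.merge(anno, on="transcript_id")
        hits = (merged[merged["go_term"] == "GO:0006915"]
                .sort_values("mrna_level", ascending=False))
        print(hits)   # export: hits.to_csv("hits.tsv", sep="\t", index=False)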

  17. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  18. APT: Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
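
    The core computation behind APT's mouse-click photometry, summing pixels in a circular aperture and subtracting a sky estimate from a surrounding annulus, can be sketched in a few lines. This is an independent, simplified illustration (whole-pixel masks, no uncertainty estimate), not APT's actual algorithm:

        import numpy as np

        def aperture_photometry(img, x0, y0, r_ap, r_in, r_out):
            """Background-subtracted source intensity: sum inside a circular
            aperture minus (median sky per pixel) * (aperture area)."""
            yy, xx = np.indices(img.shape)
            r = np.hypot(xx - x0, yy - y0)
            ap = r <= r_ap                       # aperture mask
            sky = (r >= r_in) & (r <= r_out)     # sky annulus mask
            sky_per_pix = np.median(img[sky])
            return img[ap].sum() - sky_per_pix * ap.sum()

        # Synthetic test: flat sky of 10 counts plus a 500-count "star":
        img = np.full((64, 64), 10.0)
        img[32, 32] += 500.0
        print(aperture_photometry(img, 32, 32, r_ap=5, r_in=8, r_out=12))  # ~500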

  19. [Software for performing a global phenotypic and genotypic nutritional assessment].

    PubMed

    García de Diego, L; Cuervo, M; Martínez, J A

    2013-01-01

    The nutritional assessment of a patient requires simultaneously managing extensive information and a great number of databases, as both aspects of the nutrition process and the clinical situation of the patient are analyzed. The introduction of computers in the nutritional area constitutes an extraordinary advance in the administration of nutrition information, providing a complete assessment of nutritional aspects in a quick and easy way. To develop a computer program that can be used as a tool for assessing the nutritional status of the patient, for the education of clinical staff, for epidemiological studies and for educational purposes. The system is based on a computer program which assists the health specialist in performing a full nutritional evaluation of the patient, through the registration and assessment of phenotypic and genotypic features. The application provides nutritional prognoses based on anthropometric and biochemical parameters, images of states of malnutrition, questionnaires to characterize diseases, diagnostic criteria, identification of alleles associated with the development of specific metabolic illnesses, and questionnaires on quality of life, for a customized intervention. The program includes, as part of the nutritional assessment of the patient, food intake analysis, design of diets and promotion of physical activity, introducing food frequency questionnaires, dietary recalls, healthy eating indexes, model diets, fitness tests, and recommendations, recalls and questionnaires of physical activity. The program was implemented with Java Swing, using an SQLite database and some external libraries such as JFreeChart for plotting graphs. This newly designed software is composed of five blocks organized into ten modules: Patients, Anthropometry, Clinical History, Biochemistry, Dietary History, Diagnostic (with genetic makeup), Quality of life, Physical activity, Energy expenditure and Diets. Each module has a specific function which evaluates a different aspect of the nutritional status of the patient. UNyDIET is a global computer program, customized and upgradeable, easy to use and versatile, aimed at health specialists, medical staff, dietitians, nutritionists, scientists and educators. This tool can be used as a working instrument in programs promoting health, in nutritional and clinical assessments, as well as in the evaluation of health care quality, in epidemiological studies, in nutrition intervention programs and in teaching. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
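
    The abstract does not specify the equations behind the Energy expenditure module. As a hedged illustration of the kind of calculation such a module performs, here are the classical Harris-Benedict basal-metabolic-rate equations, a standard published formula and not necessarily the one UNyDIET uses:

        def harris_benedict_bmr(weight_kg, height_cm, age_yr, sex):
            """Classical Harris-Benedict basal metabolic rate (kcal/day).
            Illustrative only; UNyDIET's internal equations are not given
            in this abstract and may differ."""
            if sex == "male":
                return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.775 * age_yr
            return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

        print(harris_benedict_bmr(70.0, 175.0, 40, "male"))  # ~1630 kcal/day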

  20. Proposal for the design of a zero gravity tool storage device

    NASA Technical Reports Server (NTRS)

    Stuckwisch, Sue; Carrion, Carlos A.; Phillips, Lee; Laughlin, Julia; Francois, Jason

    1994-01-01

    Astronauts frequently use a variety of hand tools during space missions, especially on repair missions. A toolbox is needed to allow storage and retrieval of tools with minimal difficulty. The toolbox must contain the tools during launch, landing, and on-orbit operations. The toolbox will be used in the Shuttle Bay and therefore must withstand the hazardous space environment. The three main functions of the toolbox in space are: to protect the tools from the space environment and from damaging one another; to allow for quick, one-handed access to the tools; and to minimize the heat transfer between the astronaut's hand and the tools. This proposal explores the primary design issues associated with the design of the toolbox. Included are the customer and design specifications, global and refined function structures, possible solution principles, concept variants, and finally design recommendations.
