Sample records for basic software tools

  1. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    ERIC Educational Resources Information Center

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing in which undergraduates at the University of Virginia used a module designed to improve understanding of the psychological concepts of…

  2. Virtual Immunology: Software for Teaching Basic Immunology

    ERIC Educational Resources Information Center

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have had difficulty conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available…

  3. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  4. BH-ShaDe: A Software Tool That Assists Architecture Students in the Ill-Structured Task of Housing Design

    ERIC Educational Resources Information Center

    Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis

    2016-01-01

    In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…

  5. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings: The SNP_tools package is an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis needs). Conclusion: Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806

  6. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    PubMed

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. The SNP_tools package is an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis needs). Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
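
    For illustration only (SNP_tools itself is a Visual Basic for Applications add-in whose code is not reproduced here), the following Python sketch shows one of the basic genetic analyses such tools typically provide: a Hardy-Weinberg equilibrium chi-square test computed from genotype counts. The counts are made up.

      # Hypothetical example, not SNP_tools code: Hardy-Weinberg equilibrium
      # (HWE) chi-square test for one biallelic SNP from genotype counts.
      from scipy.stats import chi2

      def hwe_chi_square(n_aa, n_ab, n_bb):
          """Chi-square goodness-of-fit statistic and p-value for HWE."""
          n = n_aa + n_ab + n_bb
          p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
          q = 1 - p
          expected = (n * p * p, 2 * n * p * q, n * q * q)
          observed = (n_aa, n_ab, n_bb)
          stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
          return stat, chi2.sf(stat, df=1)         # df = 3 classes - 1 - 1 estimated

      print(hwe_chi_square(180, 250, 70))          # made-up genotype counts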

  7. Proofreading using an assistive software homophone tool: compensatory and remedial effects on the literacy skills of students with reading difficulties.

    PubMed

    Lange, Alissa A; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones highlighted only, or with no help. The group using the homophone tool significantly outperformed the other two groups on assisted proofreading and outperformed the others on unassisted spelling, although not significantly. Remedial (unassisted) improvements in automaticity of word recognition, homophone proofreading, and basic reading were found over all groups. Results elucidate the differential contributions of each function of the homophone tool and suggest that with the proper training, assistive software can help not only students with diagnosed disabilities but also those with generally weak reading skills.
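
    The homophone tool studied above is part of a commercial assistive package, so the following is only a toy Python sketch of the function the abstract describes: flagging words that have homophones so the proofreader can check them in context. The dictionary entries are illustrative.

      # Toy homophone flagger; the dictionary is a tiny hand-made sample.
      HOMOPHONES = {
          "their": ["there", "they're"],
          "there": ["their", "they're"],
          "to": ["too", "two"],
          "its": ["it's"],
          "hear": ["here"],
      }

      def flag_homophones(sentence):
          for word in sentence.lower().split():
              w = word.strip(".,!?;:")
              if w in HOMOPHONES:
                  print(f"check '{w}': could be {', '.join(HOMOPHONES[w])}")

      flag_homophones("Their going to the park to hear music.")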

  8. Index of Workplace & Adult Basic Skills Software.

    ERIC Educational Resources Information Center

    Askov, Eunice N.; Clark, Cindy Jo

    This index of workplace and adult basic skills computer software includes 108 listings. Each listing is described according to the following classifications: (1) teacher/tutor tools (customizable or mini-authoring systems); (2) assessment and skills; (3) content; (4) instruction method; (5) system requirements; and (6) name, address, and phone…

  9. Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software

    PubMed Central

    Williams, Linda; Grayson, Diana; Gosbee, John

    2001-01-01

    Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.

  11. Computer Mathematical Tools: Practical Experience of Learning to Use Them

    ERIC Educational Resources Information Center

    Semenikhina, Elena; Drushlyak, Marina

    2014-01-01

    The article contains general information about the use of specialized mathematics software in the preparation of math teachers. The authors indicate the reasons to study the mathematics software. In particular, they analyze the possibility of presenting basic mathematical courses using mathematical computer tools from both a teacher and a student,…

  12. General-Purpose Electronic System Tests Aircraft

    NASA Technical Reports Server (NTRS)

    Glover, Richard D.

    1989-01-01

    Versatile digital equipment supports research, development, and maintenance. Extended aircraft interrogation and display system is general-purpose assembly of digital electronic equipment on ground for testing of digital electronic systems on advanced aircraft. Many advanced features, including multiple 16-bit microprocessors, pipeline data-flow architecture, advanced operating system, and resident software-development tools. Basic collection of software includes program for handling many types of data and for displays in various formats. User easily extends basic software library. Hardware and software interfaces to subsystems provided by user designed for flexibility in configuration to meet user's requirements.

  13. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    NASA Astrophysics Data System (ADS)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of a minute is crucial to reaching the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  14. Space Shuttle Software Development and Certification

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Henderson, Johnnie A.

    2000-01-01

    Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.

  15. Geneious Basic: An integrated and extendable desktop software platform for the organization and analysis of sequence data

    PubMed Central

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-01-01

    Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367

  16. Geneious Basic: an integrated and extendable desktop software platform for the organization and analysis of sequence data.

    PubMed

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-06-15

    The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl.

  17. Use of Cloud-Based Graphic Narrative Software in Medical Ethics Teaching

    ERIC Educational Resources Information Center

    Weber, Alan S.

    2015-01-01

    Although used as a common pedagogical tool in K-12 education, online graphic narrative ("comics") software has not generally been incorporated into advanced professional or technical education. This contribution reports preliminary data from a study on the use of the cloud-based graphics software Pixton.com to teach basic medical ethics…

  18. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
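
    SPARSKIT itself is a Fortran tool kit; as a rough modern analogue of the basic operations listed above (converting data structures, printing simple statistics, and linear algebra with sparse matrices), here is a SciPy sketch on a small made-up matrix.

      # Illustrative SciPy analogue of basic sparse-matrix tool-kit operations.
      import numpy as np
      from scipy.sparse import coo_matrix

      rows = np.array([0, 0, 1, 2, 2])
      cols = np.array([0, 2, 1, 0, 2])
      vals = np.array([4.0, -1.0, 3.0, -1.0, 5.0])
      A = coo_matrix((vals, (rows, cols)), shape=(3, 3))  # coordinate format

      csr = A.tocsr()                           # convert to compressed sparse row
      print("nonzeros:", csr.nnz)               # simple statistics on the matrix
      print("nonzeros per row:", np.diff(csr.indptr))
      print("A @ [1,1,1] =", csr @ np.ones(3))  # sparse matrix-vector product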

  19. Basic analysis of reflectometry data software package for the analysis of multilayered structures according to reflectometry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.

    2012-01-15

    The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.

  20. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  1. The LHCb Starterkit

    NASA Astrophysics Data System (ADS)

    Puig, Albert; LHCb Starterkit Team

    2017-10-01

    The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired "on the go" and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The course focuses on teaching basic skills for research computing. Unlike traditional tutorials, we focus on starting with the basics, presenting all the material live, with a high degree of interactivity, giving priority to understanding the tools as opposed to handing out recipes that work "as if by magic". The LHCb Starterkit was started by two young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as an advanced one, have taken place since the start of the initiative in 2015, and were taught largely by PhD students to other PhD students.

  2. The evolution of CMS software performance studies

    NASA Astrophysics Data System (ADS)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on cleaning up many issues stemming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on building tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  3. IDSE Version 1 User's Manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modeling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer-aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices describe hardware and software requirements, installation procedures, and basic hardware usage.

  4. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. And even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows the developers to focus on the novel aspects of their tools, rather than on the basic functions, which can contribute substantially to faster development, and better tools for proteomics.
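
    compomics-utilities is a Java library and its API is not reproduced here; purely to give a flavor of the "basic set of features" the abstract mentions (handling protein sequences), this Python sketch parses a made-up FASTA record and reports simple sequence properties.

      # Illustrative only: parse a fictitious FASTA record and report
      # simple properties of the protein sequence.
      fasta = """>sp|EXAMPLE1|TEST A made-up protein
      MKTAYIAKQR
      QISFVKSHFS
      """

      lines = fasta.strip().splitlines()
      header = lines[0]
      sequence = "".join(line.strip() for line in lines[1:])
      print("accession:", header.split("|")[1])
      print("length:", len(sequence))
      print("lysine count:", sequence.count("K"))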

  5. CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000

    DTIC Science & Technology

    2000-06-01

    Selected contents: "Techniques for Efficiently Generating and Testing Software" by Keith R. Wegner, which presents a proven process that uses advanced tools to design, develop and test... optimal software; and "Large Software Systems—Back to Basics," on why development methods that work on small problems seem to not scale well to... Cited references include: Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994; and Ferguson, Pat, Watts S...

  6. Educational Software for First Order Logic Semantics in Introductory Logic Courses

    ERIC Educational Resources Information Center

    Mauco, María Virginia; Ferrante, Enzo; Felice, Laura

    2014-01-01

    Basic courses on logic are common in most computer science curricula. Students often have difficulties in handling formalisms and getting familiar with them. Educational software helps to motivate and improve the teaching-learning processes. Therefore, incorporating these kinds of tools becomes important, because they contribute to gaining…

  7. PHARMAVIRTUA: Educational Software for Teaching and Learning Basic Pharmacology

    ERIC Educational Resources Information Center

    Fidalgo-Neto, Antonio Augusto; Alberto, Anael Viana Pinto; Bonavita, André Gustavo Calvano; Bezerra, Rômulo José Soares; Berçot, Felipe Faria; Lopes, Renato Matos; Alves, Luiz Anastacio

    2014-01-01

    Information and communication technologies have become important tools for teaching scientific subjects such as anatomy and histology as well as other, nondescriptive subjects like physiology and pharmacology. Software has been used to facilitate the learning of specific concepts at the cellular and molecular levels in the biological and health…

  8. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  9. A generic testbed for the design of plasma spectrometer control software with application to the THOR-CSW solar wind instrument

    NASA Astrophysics Data System (ADS)

    De Keyser, Johan; Lavraud, Benoit; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Gamby, Emmanuel; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent; Marcucci, Federica; Brienza, Daniele

    2017-04-01

    Modern plasma spectrometers require intelligent software that is able to exploit their capabilities to the fullest. While the low-level control of the instrument and basic tasks such as performing the basic measurement, temperature control, and production of housekeeping data are to be done by software that is executed on an FPGA and/or processor inside the instrument, higher level tasks such as control of measurement sequences, on-board moment calculation, beam tracking decisions, and data compression, may be performed by the instrument or in the payload data processing unit. Such design decisions, as well as an assessment of the workload on the different processing components, require early prototyping. We have developed a generic simulation testbed for the design of plasma spectrometer control software that allows an early evaluation of the level of resources that is needed at each level. Early prototyping can pinpoint bottlenecks in the design allowing timely remediation. We have applied this tool to the THOR Cold Solar Wind (CSW) plasma spectrometer. Some examples illustrating the usefulness of the tool are given.

  10. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

    Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.

  11. Cognitive Foundry v. 3.0 (OSS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basilico, Justin; Dixon, Kevin; McClain, Jonathan

    2009-11-18

    The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.

  12. Tools and Methods for Teaching Informatics at School: An Advanced Logo Course.

    ERIC Educational Resources Information Center

    Nikolov, Rumen

    1992-01-01

    Describes a course in educational informatics for preservice teachers and students in educational software development that emphasizes the use of LOGO, and summarizes course modules that cover tools and methods for teaching informatics, informatics curriculum design, introducing the basic notions of informatics, integrating informatics into the…

  13. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    PubMed

    Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A

    2018-03-01

    Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low-resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.

  14. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    DTIC Science & Technology

    1981-04-30

    However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques... development. Stated objectives include: maintain requirements development independent of the target machine and the eventual software design; and allow for easy response to...

  15. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  16. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.

  17. The Complexity Analysis Tool

    DTIC Science & Technology

    1988-10-01

    overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based... CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), Ada (subset... UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths and...
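
    CAT targets languages such as BASIC, ATLAS and Ada; as an illustration of the underlying idea of computing complexity on a module basis, this Python sketch approximates McCabe cyclomatic complexity per function with the standard-library ast module (the set of decision nodes counted is a simplification).

      # Simplified per-function cyclomatic complexity: 1 + decision points.
      import ast
      import textwrap

      DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

      def cyclomatic(source):
          out = {}
          for node in ast.walk(ast.parse(source)):
              if isinstance(node, ast.FunctionDef):
                  out[node.name] = 1 + sum(
                      isinstance(n, DECISIONS) for n in ast.walk(node))
          return out

      code = """
      def classify(x):
          if x < 0:
              return "negative"
          for d in range(2, x):
              if x % d == 0:
                  return "composite"
          return "other"
      """
      print(cyclomatic(textwrap.dedent(code)))   # {'classify': 4}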

  18. The Development and Evaluation of Software to Foster Professional Development in Educational Assessment

    ERIC Educational Resources Information Center

    Benton, Morgan C.

    2008-01-01

    This dissertation sought to answer the question: Is it possible to build a software tool that will allow teachers to write better multiple-choice questions? The thesis proceeded from the finding that the quality of teaching is very influential in the amount that students learn. A basic premise of this research, then, is that improving teachers…

  19. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

    A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals which are refined into quantifiable questions which specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used in an active as well as passive way by learning from analyzing the data and improving the methods and tools based upon what is learned from that analysis. Several examples were given representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.

  20. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  1. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules

    PubMed Central

    Solernou, Albert

    2018-01-01

    Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low-resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package. PMID:29570700

  2. LevRad software as a tool to learn how to proceed with an evaluation of barriers.

    PubMed

    Ferreira, C C; Souza, S O

    2011-05-30

    We developed the software LevRad with the objective of teaching how to proceed in an analysis of barriers shielding against x-rays, to minimize the contact of the professional or the student with x-rays and also to prevent wearing out of the x-ray equipment. Some tests of the software were made, and preliminary results indicate that LevRad is efficient as a complementary tool for the development of professionals related to diagnostic radiology. In the case of education, an advantage is gained when the beginner uses the software before his or her first contact with x-ray equipment in loco. The software introduces basic knowledge about evaluation of barriers, prevents wearing out of the x-ray tube, reinforces teaching of evaluation of barriers, and reduces the collective effective dose by avoiding unnecessary exposures when possible.

  3. [Analysis of software for identifying spectral line of laser-induced breakdown spectroscopy based on LabVIEW].

    PubMed

    Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang

    2012-03-01

    Self-designed identifying software for LIBS spectral lines is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks. The second-difference and threshold methods were employed. Characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral line identification and qualitative analysis of the basic composition of a sample. This software can analyze spectra handily and rapidly. It will be a useful tool for LIBS.

  4. Software-Based Visual Loan Calculator For Banking Industry

    NASA Astrophysics Data System (ADS)

    Isizoh, A. N.; Anazia, A. E.; Okide, S. O.; Onyeyili, T. I.; Okwaraoka, C. A. P.

    2012-03-01

    A visual loan calculator for the banking industry is very necessary in the modern-day banking system, using many design techniques for security reasons. This paper thus presents the software-based design and implementation of a visual loan calculator for the banking industry using Visual Basic .Net (VB.Net). The fundamental approach to this is to develop a Graphical User Interface (GUI) using VB.Net operating tools, and then to develop a working program which calculates the interest on any loan obtained. The VB.Net programming was done and implemented, and the software proved satisfactory.
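
    The paper's calculator is implemented in VB.Net and its code is not reproduced here; this Python sketch shows the standard amortization arithmetic at the core of such a loan calculator, with made-up loan figures.

      # Standard fixed-rate amortization formula (illustrative figures).
      def monthly_payment(principal, annual_rate, years):
          r = annual_rate / 12.0                 # monthly interest rate
          n = years * 12                         # number of monthly payments
          if r == 0:
              return principal / n
          return principal * r / (1 - (1 + r) ** -n)

      pmt = monthly_payment(10_000, 0.08, 5)     # 10,000 at 8% over 5 years
      print(f"monthly payment: {pmt:.2f}")
      print(f"total interest:  {pmt * 60 - 10_000:.2f}")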

  5. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons, which is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
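
    RipleyGUI is MATLAB-based software and is not reproduced here; as a minimal illustration of the statistic it is built around, this NumPy sketch computes a naive (edge-effect-uncorrected) estimate of Ripley's K-function for a synthetic 3D point pattern.

      # Naive Ripley's K estimate in 3D, ignoring edge corrections.
      import numpy as np

      def ripley_k_3d(points, radii, volume):
          n = len(points)
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          d = d[~np.eye(n, dtype=bool)]          # drop self-distances
          intensity = n / volume
          return np.array([(d < r).sum() / (n * intensity) for r in radii])

      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 100, size=(200, 3))   # synthetic "cells" in a box
      print(ripley_k_3d(pts, np.array([5.0, 10.0, 20.0]), volume=100.0 ** 3))
      # Under complete spatial randomness K(r) is roughly (4/3) * pi * r**3.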

  6. Software in the Classroom: Issues in the Design of Effective Software Tools. Technical Report No. 15.

    ERIC Educational Resources Information Center

    Kurland, D. Midian

    This paper identifies three ways that computers are used in educational contexts. The first and most widespread use is as a tutor, i.e., as a delivery system for programmed instruction and drill-and-practice activities. The second use is as a programming environment to teach programming languages such as BASIC, LOGO, or PASCAL. The third use is as…

  7. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    NASA Technical Reports Server (NTRS)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.
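
    The record does not describe MatchGUI's own algorithm; as an illustration of one standard automatic co-registration technique, this NumPy sketch recovers a pure translation between two images by phase correlation (MatchGUI additionally handles scale and in-plane rotation).

      # Phase correlation: recover the translation aligning image b to image a.
      import numpy as np

      def phase_correlate(a, b):
          F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
          corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          # peaks past the midpoint correspond to negative shifts
          return tuple(p - s if p > s // 2 else p
                       for p, s in zip(peak, corr.shape))

      img = np.random.default_rng(1).random((128, 128))
      shifted = np.roll(img, shift=(7, -12), axis=(0, 1))
      print(phase_correlate(shifted, img))       # expect (7, -12)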

  8. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

    Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software only includes the image quantification, descriptions, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This was a decision based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.

  9. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  10. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
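
    The report reviews popular resource-estimation models only in general terms; as one concrete, well-known example of the kind of model such procedures calibrate, this Python sketch evaluates the basic COCOMO effort formula with its textbook organic-mode coefficients.

      # Basic COCOMO (organic mode): effort in person-months from size in KLOC.
      def cocomo_effort(kloc, a=2.4, b=1.05):
          return a * kloc ** b

      for size in (10, 50, 100):
          print(f"{size} KLOC -> {cocomo_effort(size):.0f} person-months")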

  11. ControlShell: A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Chen, Vincent W.; Pardo-Castellote, Gerardo

    1994-01-01

    The ControlShell system is a programming environment that enables the development and implementation of complex real-time software. It includes many building tools for complex systems, such as a graphical finite state machine (FSM) tool to provide strategic control. ControlShell has a component-based design, providing interface definitions and mechanisms for building real-time code modules along with providing basic data management. Some of the system-building tools incorporated in ControlShell are a graphical data flow editor, a component data requirement editor, and a state-machine editor. It also includes a distributed data flow package, an execution configuration manager, a matrix package, and an object database and dynamic binding facility. This paper presents an overview of ControlShell's architecture and examines the functions of several of its tools.

  12. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  13. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  14. Data Standards for Flow Cytometry

    PubMed Central

    SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.

    2009-01-01

    Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research and treatment, stem cell manipulation, and detecting microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments, and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. As well, tools for electronic collaboration will assist the integrated access and comprehension of experiments to empower users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research, impacting a notably diverse range of medical and environmental research areas. PMID:16901228

  15. Setting Up Git Software Tool on Linux | High-Performance Computing | NREL

    Science.gov Websites

    Guidance for setting up the Git software tool on Linux for use with NREL high-performance computing systems. Before you can get started using the github.nrel.gov git repos, you'll have to do some basic configuration and may need secure shell (SSH) keys created on those systems; the page then walks through the steps for using a remote Git repository, after which you have all the basic configuration for using git.

  16. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools.
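
    PopED lite's internals are not shown in the abstract; this Python sketch only illustrates the criterion it optimizes, scoring candidate sampling-time designs by the determinant of the Fisher Information Matrix (D-optimality) for an assumed mono-exponential model y = A*exp(-k*t) with additive noise. Model, parameter values and designs are made up.

      # D-optimality sketch: larger det(FIM) means a more informative design.
      import numpy as np

      def fim(times, A=10.0, k=0.3, sigma=1.0):
          t = np.asarray(times, dtype=float)
          dA = np.exp(-k * t)                    # sensitivity w.r.t. A
          dk = -A * t * np.exp(-k * t)           # sensitivity w.r.t. k
          J = np.column_stack([dA, dk])          # samples x parameters Jacobian
          return J.T @ J / sigma ** 2

      for design in ([0.5, 1.0, 2.0], [0.5, 3.0, 8.0]):
          print(design, "det(FIM) =", np.linalg.det(fim(design)))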

  17. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here are the general framework and the initial results of a joint effort to derive novel research tools and easy-to-use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
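
    The tools described in the paper are not reproduced here; assuming the PyWavelets package, this Python sketch shows the basic operation involved: decomposing a turbulence-like signal into wavelet coefficients and summarizing the energy captured at each scale.

      # Discrete wavelet decomposition of a noisy test signal (needs pywt).
      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 1024)
      signal = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

      coeffs = pywt.wavedec(signal, "db4", level=5)
      for i, c in enumerate(coeffs):
          label = "approx" if i == 0 else f"detail {i}"
          print(f"{label:>9}: energy = {np.sum(c ** 2):.1f}")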

  18. Matlab-Excel Interface for OpenDSS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The software allows users of the OpenDSS grid modeling software to access their load flow models through a GUI interface developed in MATLAB. The circuit definitions are entered into a Microsoft Excel spreadsheet, which makes circuit creation and editing much simpler than with the basic text-based editors used in the native OpenDSS interface. Plot tools have been developed which can be accessed through a MATLAB GUI once the desired parameters have been simulated.

  19. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  20. Freely available compound data sets and software tools for chemoinformatics and computational medicinal chemistry applications

    PubMed Central

    Bajorath, Jurgen

    2012-01-01

    We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations. PMID:24358818

  1. FoilSim: Basic Aerodynamics Software Created

    NASA Technical Reports Server (NTRS)

    Peterson, Ruth A.

    1999-01-01

    FoilSim is interactive software that simulates the airflow around various shapes of airfoils. The graphical user interface, which looks more like a video game than a learning tool, captures and holds the students' interest. The software is a product of NASA Lewis Research Center's Learning Technologies Project, an educational outreach initiative within the High Performance Computing and Communications Program (HPCCP). This airfoil view panel is a simulated view of a wing being tested in a wind tunnel. As students create new wing shapes by moving slider controls that change parameters, the software calculates their lift. FoilSim also displays plots of pressure or airspeed above and below the airfoil surface.

  2. Virtual immunology: software for teaching basic immunology.

    PubMed

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules: "Organs and Lymphoid Tissues" and "Inflammation" and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university level professors about the quality of the software and intuitiveness of use, facility of navigation, and aesthetic organization using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" are offering more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension. © 2013 by The International Union of Biochemistry and Molecular Biology.

  3. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names, together with a brief overview of their topics, are listed. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  4. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
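
    As a hedged illustration of the underlying idea (not the authors' COSMIC-FFP-based model), the reliability of a chain of tool components can be computed from a discrete-time Markov chain with absorbing success/failure states; the component reliabilities below are hypothetical:

        # Transient states are tool components; "failure" and "success" are absorbing.
        # The absorption probabilities follow from the fundamental matrix
        # B = (I - Q)^-1 R of the absorbing Markov chain.
        import numpy as np

        # Hypothetical components: 0=acquire, 1=parse, 2=report; each hands control
        # onward with probability equal to its reliability, else fails.
        r = [0.99, 0.97, 0.995]                # per-component reliability (assumed)
        Q = np.array([[0, r[0], 0],
                      [0, 0,    r[1]],
                      [0, 0,    0   ]])        # transient-to-transient transitions
        R = np.array([[1 - r[0], 0],
                      [1 - r[1], 0],
                      [1 - r[2], r[2]]])       # columns: failure, success
        B = np.linalg.inv(np.eye(3) - Q) @ R
        print("P(tool completes successfully) =", B[0, 1])   # equals r0*r1*r2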

  5. Using the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material: the system user guide and technical documentation; a GMAT basics overview; and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the TESS project, with SBU and ITAR data removed. Slides for navigation and optimal control are borrowed from system documentation and training material.

  6. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
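
    A minimal sketch of the DES idea, assuming a single scanner with made-up arrival and scan-time distributions (not one of the article's models), fits in a few lines of standard-library Python:

        # Toy discrete-event simulation: one scanner, exponential interarrival and
        # scan times, reporting the mean patient wait. All rates are hypothetical.
        import heapq
        import random

        random.seed(1)
        N = 1000                       # patients to simulate
        t, free_at, waits = 0.0, 0.0, []
        arrivals = []
        for _ in range(N):
            t += random.expovariate(1 / 12.0)    # mean 12 min between arrivals
            heapq.heappush(arrivals, t)          # event queue of arrival times
        while arrivals:
            arrive = heapq.heappop(arrivals)
            start = max(arrive, free_at)         # wait if the scanner is busy
            waits.append(start - arrive)
            free_at = start + random.expovariate(1 / 10.0)  # mean 10 min scan
        print(f"mean wait: {sum(waits)/len(waits):.1f} min")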

  7. A simple tool for stereological assessment of digital images: the STEPanizer.

    PubMed

    Tschanz, S A; Burri, P H; Weibel, E R

    2011-07-01

    STEPanizer is an easy-to-use computer-based software tool for the stereological assessment of digitally captured images from all kinds of microscopical (LM, TEM, LSM) and macroscopical (radiology, tomography) imaging modalities. The program design focuses on providing the user with a defined workflow adapted to most basic stereological tasks. The software is compact, that is, user-friendly without being bulky. STEPanizer comprises the creation of test systems, the appropriate display of digital images with superimposed test systems, a scaling facility, a counting module and an export function for the transfer of results to spreadsheet programs. Here we describe the major workflow of the tool, illustrating the application on two examples from transmission electron microscopy and light microscopy, respectively. © 2011 The Authors. Journal of Microscopy © 2011 Royal Microscopical Society.

  8. More emotional facial expressions during episodic than during semantic autobiographical retrieval.

    PubMed

    El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis

    2016-04-01

    There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed by facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., a happy facial expression for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of the cues. Our study provides insight into the facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.

  9. Reviews of Selected System and Software Tools for Strategic Defense Applications

    DTIC Science & Technology

    1990-02-01

    Interleaf and FrameMaker. Static Diagnostics: Basic testing includes validating flows, detecting orphan activity, and checking completeness of activities... Publisher, Aldus PageMaker, Unix pic, Apple .pict metafile, Interleaf, FrameMaker, or PostScript format. There are no forms for standard documents such as 3

  10. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... 2010 version of the Causal Analysis/Diagnosis Decision Information System (CADDIS). This Web site was... methods; information on basic and advanced data analyses; downloadable software tools; and an online... ENVIRONMENTAL PROTECTION AGENCY [FRL-9206-7] 2010 Release of CADDIS (Causal Analysis/Diagnosis...

  11. Robotics for Computer Scientists: What's the Big Idea?

    ERIC Educational Resources Information Center

    Touretzky, David S.

    2013-01-01

    Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…

  12. Fuzzy Logic Engine

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna

    2005-01-01

    The Fuzzy Logic Engine is a software package that enables users to embed fuzzy-logic modules into their application programs. Fuzzy logic is useful as a means of formulating human expert knowledge and translating it into software to solve problems. Fuzzy logic provides flexibility for modeling relationships between input and output information and is distinguished by its robustness with respect to noise and variations in system parameters. In addition, linguistic fuzzy sets and conditional statements allow systems to make decisions based on imprecise and incomplete information. The user of the Fuzzy Logic Engine need not be an expert in fuzzy logic: it suffices to have a basic understanding of how linguistic rules can be applied to the user's problem. The Fuzzy Logic Engine is divided into two modules: (1) a graphical-interface software tool for creating linguistic fuzzy sets and conditional statements and (2) a fuzzy-logic software library for embedding fuzzy processing capability into current application programs. The graphical-interface tool was developed using the Tcl/Tk programming language. The fuzzy-logic software library was written in the C programming language.
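
    The flavor of linguistic rules such a package supports can be sketched as follows; this toy example (our own, not the Fuzzy Logic Engine's API) uses triangular membership functions, two rules, and centroid defuzzification:

        # Toy Mamdani-style inference: temperature in, fan speed out. All fuzzy
        # sets and rule choices below are made up for illustration.
        def tri(x, a, b, c):
            """Triangular membership: rises a->b, falls b->c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fan_speed(temp):
            cool = tri(temp, 0, 15, 25)     # input fuzzy sets over temperature (deg C)
            hot  = tri(temp, 20, 35, 50)
            # Rules: IF cool THEN slow; IF hot THEN fast (clipped output sets)
            xs = [i * 0.5 for i in range(201)]                  # output domain 0..100%
            agg = [max(min(cool, tri(x, 0, 20, 50)),
                       min(hot,  tri(x, 50, 80, 100))) for x in xs]
            num = sum(x * m for x, m in zip(xs, agg))           # centroid defuzzification
            den = sum(agg)
            return num / den if den else 0.0

        print(f"fan speed at 30 deg C: {fan_speed(30.0):.1f}%")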

  13. The probability estimation of the electronic lesson implementation taking into account software reliability

    NASA Astrophysics Data System (ADS)

    Gurov, V. V.

    2017-01-01

    Software tools for educational purposes, such as e-lessons and computer-based testing systems, have a number of distinctive features from the point of view of reliability. The main ones among them are the need to ensure a sufficiently high probability of faultless operation for a specified time, as well as the impossibility of rapid recovery by replacing a failed program with a similar running program during classes. The article considers the peculiarities of reliability evaluation of programs in contrast to assessments of hardware reliability. The basic requirements for the reliability of software used to conduct practical and laboratory classes in the form of computer-based training programs are given. A mathematical tool based on Markov chains, which makes it possible to determine the degree of debugging of a training program for use in the educational process by means of a graph of the interactions of the software modules, is presented.

  14. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    A simulation software (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, in the backdrop of the absence of such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs from the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) are indicators of the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. Performance analysis of the whole system as well as the individual units is possible using the tool. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for removal of arsenic from contaminated groundwater.
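
    For reference, the two agreement statistics reported above can be computed as follows; the observed/predicted values are invented for illustration, and R² is taken here in the 1 − SSres/SStot sense:

        # Agreement statistics on hypothetical observed/predicted concentrations:
        # R^2 = 1 - SSres/SStot and Willmott's index of agreement
        # d = 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2).
        import numpy as np

        obs  = np.array([95., 88., 70., 52., 33., 21.])   # observed (made-up data)
        pred = np.array([93., 90., 68., 50., 35., 20.])   # model predictions

        ss_res = np.sum((obs - pred) ** 2)
        ss_tot = np.sum((obs - obs.mean()) ** 2)
        r2 = 1 - ss_res / ss_tot

        d = 1 - ss_res / np.sum((np.abs(pred - obs.mean())
                                 + np.abs(obs - obs.mean())) ** 2)
        print(f"R^2 = {r2:.3f}, Willmott d = {d:.3f}")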

  15. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
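
    As an example of the quantification strategies such tools implement, relative quantification by the widely used 2^−ΔΔCt (Livak) method reduces to a few lines; the Ct values below are hypothetical and ~100% amplification efficiency is assumed:

        # Relative quantification by the Livak (2^-ddCt) method.
        def fold_change(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
            dct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
            dct_control = ct_target_control - ct_ref_control
            ddct = dct_treated - dct_control
            return 2 ** (-ddct)      # assumes ~100% amplification efficiency

        # Example: target gene Ct drops from 25.0 to 23.0 while the reference
        # gene stays flat, i.e., a ~4-fold up-regulation.
        print(fold_change(23.0, 17.0, 25.0, 17.0))   # -> 4.0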

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is considered for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.

  17. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet insure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  18. Adaptable Computing Environment/Self-Assembling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.

    Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (sort of a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.

  19. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed under an isolated and controlled environment. Also, it provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  20. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed under an isolated and controlled environment. Also, it provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  1. Warning: Projects May Be Closer than They Appear

    NASA Technical Reports Server (NTRS)

    Africa, Colby

    2004-01-01

    I had been working for two years as the technical product manager for a large software company, when their partner company gave me a call. They needed good software engineers to customize a new version of software, and they thought I was their guy. They told me what they wanted to do to the software, and they even showed me some prototypes. Their idea was to take the basic software tool that the large company was producing and make it more accessible to the customer. They would do this by building in flexibility based on user skill level and organizational maturity. I thought that was a fascinating approach, and I bought into it in a big way. I decided to leave my job and join up with the smaller company as their director of software engineering.

  2. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  3. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    PubMed

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes our method, developed from our earlier prototypes, for detecting collisions and examining the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.

  4. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
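
    The class of problems RT1D addresses can be illustrated with a deliberately simplified sketch: 1-D advection-dispersion with first-order decay on an explicit finite-difference grid (our own toy code with assumed parameters, not the VBA implementation described):

        # Explicit upwind finite-difference solution of
        # dc/dt = -v dc/dx + D d2c/dx2 - k c, with a constant inlet concentration.
        import numpy as np

        L, nx = 10.0, 101
        dx = L / (nx - 1)
        v, D, k = 1.0, 0.1, 0.1            # velocity, dispersion, decay (assumed units)
        dt, steps = 0.005, 1000            # satisfies Courant and diffusion limits

        c = np.zeros(nx)
        for _ in range(steps):
            c[0] = 1.0                                     # constant inlet concentration
            adv = -v * dt / dx * (c[1:-1] - c[:-2])        # upwind advection
            dsp = D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2])
            c[1:-1] += adv + dsp - k * dt * c[1:-1]
        print("concentration at x = 5:", c[nx // 2].round(4))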

  5. Basic to Advanced InSAR Processing: GMTSAR

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.

    2017-12-01

    Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.

  6. Streamlining Science: Three New Science Tools Make Data Collection a Snap

    ERIC Educational Resources Information Center

    Brown, Mike

    2006-01-01

    Today, collecting, evaluating, and analyzing data--the basic concepts of scientific study--usually involves electronic probeware. Probeware combines sensors that collect data with software that analyzes it once it has been sent to a computer or calculator. Science inquiry has benefited greatly from the use of electronic probeware, providing…

  7. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
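
    The basic GA loop referred to above, selection, crossover and mutation, can be sketched in a few lines; the "one-max" objective and all parameters below are illustrative only:

        # Minimal genetic algorithm: tournament selection, one-point crossover,
        # bit-flip mutation, maximizing the number of 1-bits in a string.
        import random

        random.seed(0)
        POP, BITS, GENS = 30, 20, 60

        def fitness(ind):                       # "one-max" toy objective
            return sum(ind)

        pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
        for _ in range(GENS):
            nxt = []
            while len(nxt) < POP:
                p1, p2 = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
                cut = random.randrange(1, BITS)            # one-point crossover
                child = p1[:cut] + p2[cut:]
                for i in range(BITS):                      # bit-flip mutation
                    if random.random() < 0.01:
                        child[i] ^= 1
                nxt.append(child)
            pop = nxt
        print("best fitness:", fitness(max(pop, key=fitness)))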

  8. Teach Graphic Design Basics with PowerPoint

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Spotts, Thomas H.

    2007-01-01

    While PowerPoint is generally regarded as simply software for creating slide presentations, it includes often overlooked--but powerful--drawing tools. Because it is part of the Microsoft Office package, PowerPoint comes preloaded on many computers and thus is already available in many classrooms. Since most computers are not preloaded with good…

  9. BATSE spectroscopy analysis system

    NASA Technical Reports Server (NTRS)

    Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad

    1992-01-01

    The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.

  10. A learning tool for optical and microwave satellite image processing and analysis

    NASA Astrophysics Data System (ADS)

    Dashondhi, Gaurav K.; Mohanty, Jyotirmoy; Eeti, Laxmi N.; Bhattacharya, Avik; De, Shaunak; Buddhiraju, Krishna M.

    2016-04-01

    This paper presents a self-learning tool, which contains a number of virtual experiments for processing and analysis of Optical/Infrared and Synthetic Aperture Radar (SAR) images. The tool is named Virtual Satellite Image Processing and Analysis Lab (v-SIPLAB). Experiments included in the learning tool relate to: Optical/Infrared - image and edge enhancement, smoothing, PCT, vegetation indices, mathematical morphology, accuracy assessment, supervised/unsupervised classification, etc.; Basic SAR - parameter extraction and range spectrum estimation, range compression, Doppler centroid estimation, azimuth reference function generation and compression, multilooking, image enhancement, texture analysis, edge detection, etc.; SAR Interferometry - baseline calculation, extraction of single look SAR images, registration, resampling, and interferogram generation; SAR Polarimetry - conversion of AirSAR or Radarsat data to S2/C3/T3 matrix, speckle filtering, power/intensity image generation, decomposition of S2/C3/T3, classification of S2/C3/T3 using the Wishart classifier [3]. A professional-quality polarimetric SAR software package can be found at [8], part of whose functionality is present in our system. The learning tool also contains other modules, besides executable software experiments, such as aim, theory, procedure, interpretation, quizzes, links to additional reading material and user feedback. Students can gain an understanding of optical and SAR remotely sensed images through discussion of basic principles, supported by a structured procedure for running and interpreting the experiments. Quizzes for self-assessment and a provision for online feedback are also provided to make this learning tool self-contained. Results can be downloaded after performing the experiments.
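
    As a taste of the optical experiments listed, the vegetation-index module reduces to simple band arithmetic; the sketch below computes NDVI = (NIR − Red)/(NIR + Red) on made-up reflectance values:

        # NDVI on synthetic 2x2 reflectance bands (values are invented).
        import numpy as np

        red = np.array([[0.08, 0.10], [0.30, 0.25]])   # red-band reflectance
        nir = np.array([[0.50, 0.45], [0.32, 0.28]])   # near-infrared reflectance
        ndvi = (nir - red) / (nir + red)
        print(ndvi.round(2))    # vegetated pixels near +0.7, bare/urban near 0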

  11. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    PubMed

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacture (CAM) technology were used for full crown fabrication and measurement of crown accuracy, in an attempt to establish an open CAM system for dental processing and to promote the introduction of a domestic dental computer-aided design (CAD)/CAM system. Commercially available scanning equipment was used to make a basic digital tooth model after preparation of the crown, and the CAD software that comes with the scanning device was employed to design the crown; domestic industrial CAM software processed the crown data to generate a solid model for machining purposes, and then the China-made 5-axis simultaneous contouring CNC machine tool was used to complete machining of the whole crown, whose internal accuracy was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool in combination with domestic industrial CAM technology can be used for crown making, and the crown was well positioned on the die. The internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry on the basis of a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software has been established; development of the system will promote the introduction of domestically produced dental CAD/CAM systems.

  12. Collaboration and decision making tools for mobile groups

    NASA Astrophysics Data System (ADS)

    Abrahamyan, Suren; Balyan, Serob; Ter-Minasyan, Harutyun; Degtyarev, Alexander

    2017-12-01

    Nowadays the use of distributed collaboration tools is widespread in many areas of human activity, but lack of mobility and equipment-dependency create difficulties and decelerate the development and integration of such technologies. Mobile technologies allow individuals to interact with each other without the need for traditional office spaces and regardless of location, so realizing special infrastructures on mobile platforms, with the help of ad-hoc wireless local networks, could eliminate hardware-attachment and be useful also in terms of scientific approach. Implementations of tools based on mobile infrastructures range from basic internet messengers to complex software for online collaboration in large-scale workgroups. Despite the growth of mobile infrastructures, applied distributed solutions for group decision-making and e-collaboration are not common. In this article we propose a software complex for real-time collaboration and decision-making based on mobile devices, describe its architecture and evaluate its performance.

  13. Easy-to-use software tools for teaching the basics, design and applications of optical components and systems

    NASA Astrophysics Data System (ADS)

    Gerhard, Christoph; Adams, Geoff

    2015-10-01

    Geometric optics is at the heart of optics teaching. Some of us may remember using pins and string to test the simple lens equation at school. Matters get more complex at undergraduate/postgraduate levels as we are introduced to paraxial rays, real rays, wavefronts, aberration theory and much more. Software is essential for the later stages, and the right software can profitably be used even at school. We present two free PC programs, which have been widely used in optics teaching, and have been further developed in close cooperation with lecturers/professors in order to address the current content of the curricula for optics, photonics and lasers in higher education. PreDesigner is a single thin lens modeller. It illustrates the simple lens law with construction rays and then allows the user to include field size and aperture. Sliders can be used to adjust key values with instant graphical feedback. This tool thus represents a helpful teaching medium for the visualization of basic interrelations in optics. WinLens3DBasic can model multiple thin or thick lenses with real glasses. It shows the system focii, principal planes, nodal points, gives paraxial ray trace values, details the Seidel aberrations, offers real ray tracing and many forms of analysis. It is simple to reverse lenses and model tilts and decenters. This tool therefore provides a good base for learning lens design fundamentals. Much work has been put into offering these features in ways that are easy to use, and offer opportunities to enhance the student's background understanding.
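
    The simple lens law that PreDesigner animates can also be checked numerically; a minimal sketch (ours, using the real-is-positive convention) follows:

        # Thin-lens equation 1/f = 1/u + 1/v: given focal length f and object
        # distance u, solve for image distance v and magnification m = -v/u.
        def image_distance(f_mm, u_mm):
            if u_mm == f_mm:
                raise ValueError("object at focal point: image at infinity")
            v_mm = 1.0 / (1.0 / f_mm - 1.0 / u_mm)   # rearranged lens equation
            return v_mm, -v_mm / u_mm                # image distance, magnification

        v, m = image_distance(50.0, 200.0)           # 50 mm lens, object at 200 mm
        print(f"image at {v:.1f} mm, magnification {m:.2f}")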

  14. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    PubMed

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society for Leukocyte Biology.
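
    The count of 128 combinations is simply 2^7; enumerating them, e.g., to label Boolean gates for SPICE-style output, takes a couple of lines (the marker names below are hypothetical):

        # Enumerate all positivity/negativity combinations of 7 markers.
        from itertools import product

        markers = ["IFNg", "TNFa", "IL2", "IL4", "IL17", "MIP1b", "CD107a"]
        combos = list(product("+-", repeat=len(markers)))
        print(len(combos))                                   # -> 128
        print("".join(m + s for m, s in zip(markers, combos[0])))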

  15. Learning and Teaching Mathematics through Real Life Models

    ERIC Educational Resources Information Center

    Takaci, Djurdjica; Budinski, Natalija

    2011-01-01

    This paper proposes modelling based learning as a tool for learning and teaching mathematics in high school. We report on an example of modelling real world problems in two high schools in Serbia where students were introduced for the first time to the basic concepts of modelling. Student use of computers and educational software, GeoGebra, was…

  16. SU-E-T-211: Comparison of Seven New TrueBeam Linacs with Enhanced Beam Data Conformance Using a Beam Comparison Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grzetic, S; Hessler, J; Gupta, N

    2015-06-15

    Purpose: To develop an independent software tool to assist in commissioning linacs with enhanced beam conformance, as well as to perform ongoing QA for dosimetrically equivalent linacs. Methods: Linac manufacturers offer enhanced beam conformance as an option to allow clinics to complete commissioning efficiently, as well as to implement dosimetrically equivalent linacs. The specification for enhanced conformance includes PDD as well as profiles within 80% FWHM. Recently, we commissioned seven Varian TrueBeam linacs with enhanced beam conformance. We developed a software tool in Visual Basic that allows us to load the reference beam data and compare our beam data during commissioning to evaluate enhanced beam conformance. This tool also allowed us to upload the beam data used for commissioning our dosimetrically equivalent beam models, to compare and tweak each of our linac beams to match our modelled data in Varian's Eclipse TPS. The tool will also be used during annual QA of the linacs to compare our beam data to our baseline data, as required by TG-142. Results: Our software tool was used to check beam conformance for the seven TrueBeam linacs that we commissioned in the past six months. Using our tool we found that the factory-conformed linacs showed up to 3.82% difference in their beam profile data upon installation. Using our beam comparison tool, we were able to adjust the energy and profiles of our beams to accomplish better than 1.00% point-by-point data conformance. Conclusion: The availability of quantitative comparison tools is essential to accept and commission linacs with enhanced beam conformance, as well as to beam match multiple linacs. We further intend to use the same tool to ensure our beam data conforms to the commissioning beam data during our annual QA, in keeping with the requirements of TG-142.
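
    A sketch of the point-by-point comparison such a tool performs, on synthetic profiles and with normalization to the central-axis value (our own toy construction, not the tool described), might look like this:

        # Percent difference between a "measured" and a reference beam profile,
        # normalized to the central-axis (CAX) value. Profiles are synthetic.
        import numpy as np

        x = np.linspace(-10, 10, 81)                       # off-axis position (cm)
        ref  = 100 * np.exp(-(x / 9.0) ** 8)               # reference profile (toy)
        meas = ref * (1 + 0.005 * np.sin(x))               # "measured" with ripple
        pct_diff = 100 * (meas - ref) / ref[len(x) // 2]   # normalize to CAX
        print(f"max point-by-point difference: {np.abs(pct_diff).max():.2f}%")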

  17. ProtocolNavigator: emulation-based software for the design, documentation and reproduction of biological experiments.

    PubMed

    Khan, Imtiaz A; Fraser, Adam; Bray, Mark-Anthony; Smith, Paul J; White, Nick S; Carpenter, Anne E; Errington, Rachel J

    2014-12-01

    Experimental reproducibility is fundamental to the progress of science. Irreproducible research decreases the efficiency of basic biological research and drug discovery and impedes experimental data reuse. A major contributing factor to irreproducibility is difficulty in interpreting complex experimental methodologies and designs from written text and in assessing variations among different experiments. Current bioinformatics initiatives either are focused on computational research reproducibility (i.e. data analysis) or laboratory information management systems. Here, we present a software tool, ProtocolNavigator, which addresses the largely overlooked challenges of interpretation and assessment. It provides a biologist-friendly open-source emulation-based tool for designing, documenting and reproducing biological experiments. ProtocolNavigator was implemented in Python 2.7, using the wx module to build the graphical user interface. It is a platform-independent software and freely available from http://protocolnavigator.org/index.html under the GPL v2 license. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. A graphical simulation software for instruction in cardiovascular mechanics physiology.

    PubMed

    Wildhaber, Reto A; Verrey, François; Wenger, Roland H

    2011-01-25

    Computer-supported, interactive e-learning systems are widely used in the teaching of physiology. However, the currently available complimentary software tools in the field of the physiology of cardiovascular mechanics have not yet been adapted to the latest systems software. Therefore, a simple-to-use replacement for undergraduate and graduate students' education was needed, including up-to-date graphical software that is validated and field-tested. Software compatible with Windows, based on modified versions of existing mathematical algorithms, has been newly developed. Testing was performed during a full term of physiological lecturing to medical and biology students. The newly developed CLabUZH software models a reduced human cardiovascular loop containing all basic compartments: an isolated heart including an artificial electrical stimulator, main vessels and the peripheral resistive components. Students can alter several physiological parameters interactively. The resulting output variables are plotted in x-y diagrams and, in addition, shown in an animated, graphical model. CLabUZH offers insight into the relations of volume, pressure and time dependency in the circulation and their correlation to the electrocardiogram (ECG). Established mechanisms such as the Frank-Starling Law or the Windkessel Effect are considered in this model. The CLabUZH software is self-contained, with no extra installation required, and runs on most of today's personal computer systems. CLabUZH is a user-friendly interactive computer programme that has proved to be useful in teaching the basic physiological principles of heart mechanics.
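
    The Windkessel Effect mentioned above can be reproduced independently of the software: the sketch below integrates a two-element windkessel, C·dP/dt = I(t) − P/R, with Euler steps and textbook-scale parameters (not CLabUZH's actual model or values):

        # Two-element windkessel: arterial compliance C and peripheral
        # resistance R, driven by a pulsatile systolic inflow.
        import math

        R, C = 1.0, 1.3          # mmHg*s/mL and mL/mmHg (typical magnitudes)
        dt, T = 0.001, 0.8       # time step (s), cardiac period (s)
        P = 80.0                 # initial arterial pressure (mmHg)

        def inflow(t):           # systolic ejection for the first 0.3 s of each beat
            tc = t % T
            return 425.0 * math.sin(math.pi * tc / 0.3) if tc < 0.3 else 0.0

        t = 0.0
        for _ in range(int(10 * T / dt)):          # simulate 10 beats
            dPdt = (inflow(t) - P / R) / C         # C*dP/dt = I(t) - P/R
            P += dt * dPdt
            t += dt
        print(f"end-diastolic pressure after 10 beats: {P:.0f} mmHg")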

  19. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    PubMed

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS inflight software, or hardware, failures.

  1. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    ERIC Educational Resources Information Center

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  2. Grammar Review: Your Tool for Success. Teacher Materials.

    ERIC Educational Resources Information Center

    Pittsburgh Univ., Johnstown, PA. Education Div.

    Teacher materials are provided for a computer-assisted English grammar curriculum for adult basic education students (1-8 grade level). They accompany a software program (diskette) that the student is able to use by himself/herself with the Apple IIc or Apple IIe computer with single or double drive and a monitor or a television with an R.F.…

  3. Digital Diversity: A Basic Tool with Lots of Uses

    ERIC Educational Resources Information Center

    Coy, Mary

    2006-01-01

    In this article the author relates how the digital camera has altered the way she teaches and the way her students learn. She also emphasizes the importance for teachers to have software that can edit, print, and incorporate photos. She cites several instances in which a digital camera can be used: (1) PowerPoint presentations; (2) Open house; (3)…

  4. Freeware eLearning Flash-ECG for learning electrocardiography.

    PubMed

    Romanov, Kalle; Kuusi, Timo

    2009-06-01

    Electrocardiographic (ECG) analysis can be taught in eLearning programmes with suitable software that permits the effective use of basic tools such as a ruler and a magnifier, required for measurements. The Flash-ECG (Research & Development Unit for Medical Education, University of Helsinki, Finland) was developed to enable teachers and students to use scanned and archived ECGs on computer screens and classroom projectors. The software requires only a standard web browser with a Flash plug-in and can be integrated with learning environments (Blackboard/WebCT, Moodle). The Flash-ECG is freeware and is available to medical teachers worldwide.

  5. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.

  6. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is an important lever for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  7. The RCSB Protein Data Bank: views of structural biology for basic and applied research and education

    PubMed Central

    Rose, Peter W.; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F.; Christie, Cole H.; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S.; Westbrook, John D.; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M.; Bourne, Philip E.; Burley, Stephen K.

    2015-01-01

    The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. PMID:25428375
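
    The data files and web services mentioned above are aimed at developers mining the archive programmatically. As a hedged illustration (not code published by the RCSB group), the Python sketch below pulls a structure file and entry-level annotations from RCSB's public endpoints; the endpoints and the JSON field layout shown here postdate this article and are assumptions that may change, and 4HHB is just an arbitrary example entry.

        import json
        import urllib.request

        pdb_id = "4HHB"  # arbitrary example entry

        # Download the structure file from the public PDB file archive.
        url = f"https://files.rcsb.org/download/{pdb_id}.pdb"
        with urllib.request.urlopen(url) as resp:
            pdb_text = resp.read().decode("utf-8")
        print(pdb_text.splitlines()[0])  # the HEADER record

        # Query the RCSB Data API for entry-level annotations as JSON;
        # the "struct"/"title" field path is an assumption about the API.
        api = f"https://data.rcsb.org/rest/v1/core/entry/{pdb_id}"
        with urllib.request.urlopen(api) as resp:
            entry = json.load(resp)
        print(entry["struct"]["title"])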

  8. Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy

    NASA Astrophysics Data System (ADS)

    Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.

    2010-12-01

    Radar altimetry is a technique whose applications and uses keep expanding. Although considerable effort has been made for oceanography users (including easy-to-use data), using those data for geodesy, especially in combination with ESA GOCE mission data, is still somewhat hard. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, the two being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able:
    - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions,
    - to perform some processing, data editing and statistics,
    - and to visualize the results.
    It can be used at several levels and in several ways:
    - as a data reading tool, with APIs for C, Fortran, Matlab and IDL,
    - as processing/extraction routines, through the on-line command mode,
    - as an educational and quick-look tool, with the graphical user interface.
    As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, and showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings; about 1200 people had downloaded it as of summer 2010, many of them "newcomers" to altimetry. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added; some have been added and/or improved in version 2, others are ongoing or under discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.

  9. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods. Although a given approach does not always fall neatly into a single class, this paper provides a classification of very high level design methods, with examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  10. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    NASA Astrophysics Data System (ADS)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
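
    The system-level orchestration the abstract describes is straightforward to sketch. The snippet below is a minimal illustration of the same run-model-A-then-feed-model-B pattern, written in Python rather than the Excel/VBA environment MMW actually uses; the executable names, flags, and file names are hypothetical.

        import subprocess
        from pathlib import Path

        # Hypothetical command lines; MMW itself issues comparable
        # system-level calls from VBA to run WEAP and then WEAPhish.
        WEAP_CMD = ["weap.exe", "--scenario", "butte_creek"]
        WEAPHISH_CMD = ["weaphish.exe", "--habitat-input", "weap_output.csv"]

        def run_model_chain() -> None:
            # Step 1: run the water-resources model and wait for it to finish.
            subprocess.run(WEAP_CMD, check=True)
            if not Path("weap_output.csv").exists():
                raise FileNotFoundError("WEAP did not produce the expected output")
            # Step 2: hand the hydrology output to the fish population model.
            subprocess.run(WEAPHISH_CMD, check=True)

        if __name__ == "__main__":
            run_model_chain()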

  11. Application of GIS Rapid Mapping Technology in Disaster Monitoring

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Tu, J.; Liu, G.; Zhao, Q.

    2018-04-01

    With the rapid development of GIS and RS technology, particularly in recent years, GIS software functions have matured and been steadily enhanced, and the parallel development of mathematical-statistical tools for spatial modeling and simulation has promoted the widespread application of quantitative methods in geology. Based on field disaster investigation and the construction of a spatial database, this paper uses remote sensing imagery, DEM data and GIS technology to obtain the data needed for disaster vulnerability analysis, and applies an information model to carry out disaster risk assessment mapping. Using ArcGIS software and its spatial data modeling methods, the basic data for the disaster risk mapping process were acquired and processed, and the spatial data simulation tools were used to map the disaster rapidly.

  12. Activity Catalog Tool (ACT) user manual, version 2.0

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.; Andre, Anthony D.

    1994-01-01

    This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.

  13. Data visualization and analysis tools for the MAVEN mission

    NASA Astrophysics Data System (ADS)

    Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.

    2016-12-01

    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.
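
    The tplot-style plots described, stacked time-series panels and spectrograms, are easy to mock up with matplotlib, one of the libraries the toolkit builds on. The sketch below uses synthetic data and plain matplotlib calls to illustrate the two plot types; it is not the MAVEN toolkit's own API.

        import numpy as np
        import matplotlib.pyplot as plt

        # Synthetic stand-ins for a time series and an energy-time spectrogram.
        t = np.linspace(0, 24, 500)                    # time in hours
        density = 10 + 2 * np.sin(2 * np.pi * t / 12)  # e.g. an ion density trace
        energy = np.logspace(1, 4, 64)                 # energy bins in eV
        flux = np.random.rand(64, 500) * density       # fake counts per bin

        fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
        ax1.plot(t, density)
        ax1.set_ylabel("density")
        mesh = ax2.pcolormesh(t, energy, flux, shading="auto")
        ax2.set_yscale("log")
        ax2.set_ylabel("energy (eV)")
        ax2.set_xlabel("time (h)")
        fig.colorbar(mesh, ax=ax2, label="flux")
        plt.show()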

  14. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  15. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
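
    For readers unfamiliar with the term, the "exact Bayes factors" mentioned above compare two models by the ratio of their marginal likelihoods; this is the standard textbook definition rather than notation taken from the paper:

        \mathrm{BF}_{12} \;=\; \frac{p(D \mid M_1)}{p(D \mid M_2)},
        \qquad
        p(D \mid M_k) \;=\; \int p(D \mid \theta_k, M_k)\, p(\theta_k \mid M_k)\, d\theta_k

    For exponential-family likelihoods with conjugate priors these integrals have closed forms, which is what makes the factors computable exactly.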

  16. Remote Sensing Image Analysis Without Expert Knowledge - A Web-Based Classification Tool On Top of Taverna Workflow Management System

    NASA Astrophysics Data System (ADS)

    Selsam, Peter; Schwartze, Christian

    2016-10-01

    Providing software solutions via the internet has been known for quite some time and is now an increasing trend marketed as "software as a service". Many business units have embraced the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage, but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communication structures and enabled to run on a high-power server, benefiting from Taverna software. On top, a GIS-like, web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object-oriented image segmentation with pattern recognition features. Basic image elements form a construction set to model large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent of the segmentation. The object definition is done completely by the software.

  17. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved, only approached to varying degrees. With the emphasis on building low-cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission-specific versions of the TASS. Very little new software needs to be developed, mainly mission-specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  18. Innovation Online Teaching Module Plus Digital Engineering Kit with Proteus Software through Hybrid Learning Method to Improve Student Skills

    NASA Astrophysics Data System (ADS)

    Kholis, Nur; Syariffuddien Zuhrie, Muhamad; Rahmadian, Reza

    2018-04-01

    Industry today demands a workforce that is competent in its field of work. Until now, however, the delivery of Digital Engineering lecture material (especially Basic Digital Electronics and Basic Digital Circuits) has been limited to verbal lectures (the classical method) dominated by the lecturer (teacher-centered). Yet the subject of Digital Engineering requires learning tools and an understanding of electronic circuits, digital electronics and logic circuits, so that learners can apply them in the world of work. One effort to achieve this is to create an online teaching module and an educational kit supported by Proteus software that can improve learners' skills. This study aims to innovate online teaching modules plus kits in Proteus-assisted digital engineering courses through a hybrid learning approach to improve the skills of learners. The innovation process takes into account the skills and technological mastery of students of the Department of Electrical Engineering, Faculty of Engineering, Universitas Negeri Surabaya, in order to produce quality graduates through the use of the online module plus the Proteus-assisted kit in a hybrid learning approach. In general, the aim is to obtain adequate results at an affordable investment cost, in a form that is user-friendly, attractive and interactive (easily adapted to developments in information and communication technology). With the right design, implementation and operation of the online teaching module, the offline teaching module, the kit (educational viewer) and the e-learning content (both online and offline), the use of these tools can be adjusted to the standard needs of the information and communication technology world, both nationally and internationally.

  19. Program Management Tool

    NASA Technical Reports Server (NTRS)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among various NASA systems that impact schedules and planning.

  20. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies
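
    The first measurement in the list reduces to the small-angle relation between angular size and distance: for a fixed physical size L, theta ≈ L/d, so distances scale as the inverse ratio of measured angular sizes. A quick worked example in Python (the numbers are rounded textbook values, not measurements from the activity's images):

        import math

        # Distance variation of the moon from its apparent (angular) diameter:
        # for fixed L, theta ≈ L / d, so d1 / d2 = theta2 / theta1.
        theta_perigee = math.radians(0.558)  # apparent lunar diameter at perigee
        theta_apogee = math.radians(0.491)   # apparent lunar diameter at apogee
        print(f"apogee/perigee distance ratio ≈ {theta_perigee / theta_apogee:.3f}")

        # Lunar diameter from the mean distance and a measured angular size.
        d_moon_km = 384_400
        theta = math.radians(0.518)
        print(f"lunar diameter ≈ {d_moon_km * theta:.0f} km")  # roughly 3,470 km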

  1. mMass 3: a cross-platform software environment for precise analysis of mass spectrometric data.

    PubMed

    Strohalm, Martin; Kavan, Daniel; Novák, Petr; Volný, Michael; Havlícek, Vladimír

    2010-06-01

    While tools for the automated analysis of MS and LC-MS/MS data are continuously improving, it is still often the case that at the end of an experiment, the mass spectrometrist will spend time carefully examining individual spectra. Current software support is mostly provided only by the instrument vendors, and the available software tools are often instrument-dependent. Here we present a new generation of mMass, a cross-platform environment for the precise analysis of individual mass spectra. The software covers a wide range of processing tasks such as import from various data formats, smoothing, baseline correction, peak picking, deisotoping, charge determination, and recalibration. Functions presented in the earlier versions such as in silico digestion and fragmentation were redesigned and improved. In addition to Mascot, an interface for ProFound has been implemented. A specific tool is available for isotopic pattern modeling to enable precise data validation. The largest available lipid database (from the LIPID MAPS Consortium) has been incorporated and together with the new compound search tool lipids can be rapidly identified. In addition, the user can define custom libraries of compounds and use them analogously. The new version of mMass is based on a stand-alone Python library, which provides the basic functionality for data processing and interpretation. This library can serve as a good starting point for other developers in their projects. Binary distributions of mMass, its source code, a detailed user's guide, and video tutorials are freely available from www.mmass.org.

  2. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
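
    The signal-strength and signal-to-noise calculations mentioned above follow standard link budget arithmetic. The sketch below uses the textbook free-space relations, not the MPL code, and every parameter value is hypothetical.

        import math

        def fspl_db(distance_km: float, freq_mhz: float) -> float:
            """Free-space path loss in dB (standard Friis-based formula)."""
            return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

        # Hypothetical S-band telemetry link from a vehicle to a ground asset.
        eirp_dbm = 40.0        # transmitter EIRP
        rx_gain_db = 35.0      # ground antenna gain
        misc_losses_db = 3.0   # pointing, polarization, cabling
        rx_power_dbm = (eirp_dbm + rx_gain_db
                        - fspl_db(500.0, 2250.0) - misc_losses_db)

        # Noise floor kTB in dBm for a system temperature and bandwidth.
        k_dbm = -198.6         # Boltzmann constant in dBm/(K*Hz)
        noise_dbm = k_dbm + 10 * math.log10(290.0) + 10 * math.log10(1e6)

        print(f"received: {rx_power_dbm:.1f} dBm, "
              f"SNR: {rx_power_dbm - noise_dbm:.1f} dB")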

  3. SIMBA: a web tool for managing bacterial genome assembly generated by Ion PGM sequencing technology.

    PubMed

    Mariano, Diego C B; Pereira, Felipe L; Aguiar, Edgar L; Oliveira, Letícia C; Benevides, Leandro; Guimarães, Luís C; Folador, Edson L; Sousa, Thiago J; Ghosh, Preetam; Barh, Debmalya; Figueiredo, Henrique C P; Silva, Artur; Ramos, Rommel T J; Azevedo, Vasco A C

    2016-12-15

    The evolution of Next-Generation Sequencing (NGS) has considerably reduced the cost per sequenced base, allowing a significant rise in sequencing projects, mainly in prokaryotes. However, the range of available NGS platforms requires different strategies and software to correctly assemble genomes. Different strategies are necessary to properly complete an assembly project, in addition to the installation or modification of various software packages. This requires users to have significant expertise in these tools and command-line scripting experience on Unix platforms, besides basic expertise in methodologies and techniques for genome assembly. These difficulties often delay complete genome assembly projects. In order to overcome this, we developed SIMBA (SImple Manager for Bacterial Assemblies), a freely available web tool that integrates several component tools for assembling and finishing bacterial genomes. SIMBA provides a friendly and intuitive user interface so that bioinformaticians, even those with little computational expertise, can work under a centralized administrative control system of assemblies managed by the assembly center head. SIMBA guides users through the assembly process via simple and interactive pages. The SIMBA workflow is divided into three modules: (i) projects, which provides a general view of genome sequencing projects, along with data quality analysis and data format conversion; (ii) assemblies, which performs de novo assemblies with the software Mira, Minia, Newbler and SPAdes, plus assembly quality validation using the QUAST software; and (iii) curation, which provides methods for finishing assemblies through tools for scaffolding contigs and closing gaps. We also present a case study that validated the efficacy of SIMBA in managing bacterial assembly projects sequenced with Ion Torrent PGM. Besides being a web tool for genome assembly, SIMBA is a complete genome assembly project management system, which can be useful for managing several projects in laboratories. SIMBA source code is available for download and installation on local web servers at http://ufmg-simba.sourceforge.net.

  4. Hardware and software improvements to a low-cost horizontal parallax holographic video monitor.

    PubMed

    Henrie, Andrew; Codling, Jesse R; Gneiting, Scott; Christensen, Justin B; Awerkamp, Parker; Burdette, Mark J; Smalley, Daniel E

    2018-01-01

    Displays capable of true holographic video have been prohibitively expensive and difficult to build. With this paper, we present a suite of modularized hardware components and software tools needed to build a HoloMonitor with basic "hacker-space" equipment, highlighting improvements that have enabled the total materials cost to fall to $820, well below that of other holographic displays. It is our hope that the current level of simplicity, development, design flexibility, and documentation will enable the lay engineer, programmer, and scientist to relatively easily replicate, modify, and build upon our designs, bringing true holographic video to the masses.

  5. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cramer, Christopher J.

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  6. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  7. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    PubMed Central

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM

    2006-01-01

    Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis. PMID:17038197

  8. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data.

    PubMed

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M

    2006-10-13

    Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
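
    To make the computation these two records describe concrete: the core of a fixed-effect meta-analysis is an inverse-variance weighted average of the study effects. The Python sketch below shows that arithmetic on made-up numbers; it illustrates the standard method, not MIX's own Visual Basic code.

        import math

        # Hypothetical effect estimates (e.g. log odds ratios) and standard errors.
        effects = [0.21, -0.05, 0.33, 0.10, 0.25, 0.02, 0.18, 0.27]
        ses = [0.10, 0.15, 0.20, 0.08, 0.12, 0.18, 0.09, 0.14]

        weights = [1 / se**2 for se in ses]  # inverse-variance weights
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))

        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"pooled effect {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f})")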

  9. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  10. Monte Carlo Methodology Serves Up a Software Success

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.

  11. Teaching and assessment of mathematical principles for software correctness using a reasoning concept inventory

    NASA Astrophysics Data System (ADS)

    Drachova-Strang, Svetlana V.

    As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for reasoning about software correctness. This dissertation presents a systematic approach to both introducing these reasoning skills into the curriculum, and assessing how well the students have learned them. Specifically, it introduces a comprehensive Reasoning Concept Inventory (RCI) that captures the fine details of basic reasoning skills that are ideally learned across the undergraduate curriculum to reason about software correctness, to develop high quality software, and to understand why software works as specified. The RCI forms the basis for developing learning outcomes that help educators to assess the adequacy of current techniques and pinpoint necessary improvements. This dissertation contains results from experimentation and assessment over the past few years in multiple CS courses. The results show that the finer principles of mathematical reasoning of software correctness can be taught effectively and continuously improved with the help of the RCI using suitable teaching practices, and supporting methods and tools.

  12. The Educational Software Marketplace and Adult Literacy Niches. Contractor Report, Adult Literacy and New Technologies: Tools for a Lifetime.

    ERIC Educational Resources Information Center

    Education Turnkey Systems, Inc., Falls Church, VA.

    Over the past 10 years computer technology has come to occupy a central place in American life and has caused a redefinition of the level of literacy skills needed to participate effectively in American society. At the same time, some 20 to 30 million adults have serious problems of basic literacy. Within this context, the Office of Technology…

  13. Software "Socrative" and Smartphones as Tools for Implementation of Basic Processes of Active Physics Learning in Classroom: An Initial Feasibility Study with Prospective Teachers

    ERIC Educational Resources Information Center

    Méndez Coca, David; Slisko, Josip

    2013-01-01

    Many physics professors find it difficult to know and assess in real time what the students in their courses are learning. Nevertheless, today, with the Internet and the new technology devices that students use every day, like smartphones, such tasks can be carried out relatively easily. The professor poses a few questions in "Socrative," the…

  14. Demonstration of theoretical and experimental simulations in fiber optics course

    NASA Astrophysics Data System (ADS)

    Yao, Tianfu; Wang, Xiaolin; Shi, Jianhua; Lei, Bing; Liu, Wei; Wang, Wei; Hu, Haojun

    2017-08-01

    "Fiber optics" course plays a supporting effect in the curriculum frame of optics and photonics at both undergraduate and postgraduate levels. Moreover, the course can be treated as compulsory for students specialized in the fiber-related field, such as fiber communication, fiber sensing and fiber light source. The corresponding content in fiber optics requires the knowledge of geometrical and physical optics as background, including basic optical theory and fiber components in practice. Thus, to help the students comprehend the relatively abundant and complex content, it is necessary to investigate novel teaching method assistant the classic lectures. In this paper, we introduce the multidimensional pattern in fiber-optics teaching involving theoretical and laboratory simulations. First, the theoretical simulations is demonstrated based on the self-developed software named "FB tool" which can be installed in both smart phone with Android operating system and personal computer. FB tool covers the fundamental calculations relating to transverse modes, fiber lasers and nonlinearities and so on. By comparing the calculation results with other commercial software like COMSOL, SFTool shows high accuracy with high speed. Then the laboratory simulations are designed including fiber coupling, Erbium doped fiber amplifiers, fiber components and so on. The simulations not only supports students understand basic knowledge in the course, but also provides opportunities to develop creative projects in fiber optics.

  15. Development of the updated system of city underground pipelines based on Visual Studio

    NASA Astrophysics Data System (ADS)

    Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong

    2009-10-01

    Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with an Oracle9i database for data storage. In this system, ArcGIS SDE 9.1 serves as the spatial data engine, and the system is an integrated management application developed with Visual Studio visual development tools. Because the pipeline update function of the original system suffered from slow updates and occasional data loss, and to ensure that underground pipeline data can be updated conveniently, frequently and in real time while preserving the currency and integrity of the data, we added a new update module developed in-house. The module has a powerful data update function and supports data input and output as well as rapid bulk updates. The new module was likewise developed with Visual Studio visual development tools and uses Access as the underlying database. Graphics can be edited in the AutoCAD software, and the database is updated through a link between the graphics and the system. Practice shows that the update module has good compatibility with the original system and updates the database reliably and efficiently.

  16. A Legal Guide for the Software Developer.

    ERIC Educational Resources Information Center

    Minnesota Small Business Assistance Office, St. Paul.

    This booklet has been prepared to familiarize the inventor, creator, or developer of a new computer software product or software invention with the basic legal issues involved in developing, protecting, and distributing the software in the United States. Basic types of software protection and related legal matters are discussed in detail,…

  17. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies, coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations, make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers, and to identify capabilities needed in a more general-purpose tool kit.
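
    Of the evaluated techniques, cross-correlation is the simplest to sketch: slide a template of the expected pen-trace signature along the telemetry stream and score each alignment. The Python example below does this on synthetic data; it illustrates the generic technique, not the prototype's code.

        import numpy as np

        # Synthetic telemetry: a ramp-then-step "equipment activation" signature
        # buried in noise, found by normalized cross-correlation with a template.
        rng = np.random.default_rng(0)
        trace = rng.normal(0.0, 0.2, 1000)
        trace[400:420] += np.linspace(0.0, 1.0, 20)  # ramp at activation
        trace[420:] += 1.0                           # sustained current draw

        template = np.concatenate([np.zeros(20), np.linspace(0, 1, 20), np.ones(20)])
        template = (template - template.mean()) / template.std()

        best_score, best_idx = -np.inf, -1
        for i in range(len(trace) - len(template)):
            window = trace[i:i + len(template)]
            window = (window - window.mean()) / (window.std() + 1e-12)
            score = float(window @ template) / len(template)
            if score > best_score:
                best_score, best_idx = score, i

        print(f"activation detected near sample {best_idx} (score {best_score:.2f})")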

  18. The RCSB Protein Data Bank: views of structural biology for basic and applied research and education.

    PubMed

    Rose, Peter W; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F; Christie, Cole H; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S; Westbrook, John D; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M; Bourne, Philip E; Burley, Stephen K

    2015-01-01

    The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Object-oriented approach to fast display of electrophysiological data under MS-windows.

    PubMed

    Marion-Poll, F

    1995-12-01

    Microcomputers provide neuroscientists an alternative to a host of laboratory equipment to record and analyze electrophysiological data. Object-oriented programming tools provide an essential link between custom data acquisition and analysis needs and general software packages. In this paper, we outline the layout of basic objects that display and manipulate electrophysiological data files. Visual inspection of the recordings is a basic requirement of any data analysis software. We present an approach that allows flexible and fast display of large data sets. This approach involves constructing an intermediate representation of the data in order to lower the number of actual points displayed while preserving the appearance of the data. The second group of objects is related to the management of lists of data files. Typical experiments designed to test the biological activity of pharmacological products include scores of files. Data manipulation and analysis are facilitated by creating multi-document objects that include the names of all experiment files. Implementation steps of both objects are described for an MS-Windows hosted application.
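
    The intermediate representation the paper describes is commonly realized as min/max decimation: keep only the extremes of each screen-column's worth of samples, so the drawn envelope matches the full-resolution trace. A minimal Python sketch of that idea (the paper's implementation is an object-oriented MS-Windows one; this only illustrates the approach):

        import numpy as np

        def minmax_decimate(samples: np.ndarray, columns: int) -> np.ndarray:
            """Reduce a long waveform to two points (min, max) per display
            column, preserving the envelope the eye sees at full resolution."""
            usable = len(samples) - len(samples) % columns
            chunks = samples[:usable].reshape(columns, -1)
            out = np.empty(2 * columns)
            out[0::2] = chunks.min(axis=1)
            out[1::2] = chunks.max(axis=1)
            return out

        # One million samples reduced to 2 x 800 points for an 800-pixel plot.
        recording = np.random.randn(1_000_000)
        print(minmax_decimate(recording, 800).shape)  # (1600,)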

  20. MatMRI and MatHIFU: software toolboxes for real-time monitoring and control of MR-guided HIFU

    PubMed Central

    2013-01-01

    Background The availability of open and versatile software tools is a key feature to facilitate pre-clinical research for magnetic resonance imaging (MRI) and magnetic resonance-guided high-intensity focused ultrasound (MR-HIFU) and expedite clinical translation of diagnostic and therapeutic medical applications. In the present study, two customizable software tools that were developed at the Thunder Bay Regional Research Institute are presented for use with both MRI and MR-HIFU. Both tools operate in a MATLAB® environment. The first tool is named MatMRI and enables real-time, dynamic acquisition of MR images with a Philips MRI scanner. The second tool is named MatHIFU and enables the execution and dynamic modification of user-defined treatment protocols with the Philips Sonalleve MR-HIFU therapy system to perform ultrasound exposures in MR-HIFU therapy applications. Methods MatMRI requires four basic steps: initiate communication, subscribe to MRI data, query for new images, and unsubscribe. MatMRI can also pause/resume the imaging and perform real-time updates of the location and orientation of images. MatHIFU requires three basic steps: initiate communication, prepare the treatment protocol, and execute the treatment protocol. MatHIFU can monitor the state of execution and, if required, modify the protocol in real time. Results Four applications were developed to showcase the capabilities of MatMRI and MatHIFU to perform pre-clinical research. Firstly, MatMRI was integrated with an existing small animal MR-HIFU system (FUS Instruments, Toronto, Ontario, Canada) to provide real-time temperature measurements. Secondly, MatMRI was used to perform T2-based MR thermometry in the bone marrow. Thirdly, MatHIFU was used to automate acoustic hydrophone measurements on a per-element basis of the 256-element transducer of the Sonalleve system. Finally, MatMRI and MatHIFU were combined to produce and image a heating pattern that recreates the word ‘HIFU’ in a tissue-mimicking heating phantom. Conclusions MatMRI and MatHIFU leverage existing MRI and MR-HIFU clinical platforms to facilitate pre-clinical research. MatMRI substantially simplifies the real-time acquisition and processing of MR data. MatHIFU facilitates the testing and characterization of new therapy applications using the Philips Sonalleve clinical MR-HIFU system. Under coordination with Philips Healthcare, both MatMRI and MatHIFU are intended to be freely available as open-source software packages to other research groups. PMID:25512856
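
    The four MatMRI steps describe a subscribe-and-poll loop. Sketched here in Python for brevity (the real toolbox is MATLAB code talking to a Philips scanner, and every class and method name below is invented):

        import time

        class ScannerClient:
            """Hypothetical stand-in for a scanner connection."""
            def connect(self, host):
                print(f"connected to {host}")
            def subscribe(self, stream="images"):
                print(f"subscribed to {stream}")
            def query_new_images(self):
                return []  # stub; a real client returns newly acquired frames
            def unsubscribe(self):
                print("unsubscribed")

        def monitor(host, handle_image, polls=10, interval_s=0.5):
            client = ScannerClient()
            client.connect(host)        # step 1: initiate communication
            client.subscribe()          # step 2: subscribe to MRI data
            try:
                for _ in range(polls):
                    for image in client.query_new_images():  # step 3: query
                        handle_image(image)  # e.g. an MR thermometry update
                    time.sleep(interval_s)
            finally:
                client.unsubscribe()    # step 4: unsubscribe

        monitor("mri-scanner.local", handle_image=print)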

  1. Development and assessment of a digital X-ray software tool to determine vertebral rotation in adolescent idiopathic scoliosis.

    PubMed

    Eijgenraam, Susanne M; Boselie, Toon F M; Sieben, Judith M; Bastiaenen, Caroline H G; Willems, Paul C; Arts, Jacobus J; Lataster, Arno

    2017-02-01

    The amount of vertebral rotation in the axial plane is of key importance in the prognosis and treatment of adolescent idiopathic scoliosis (AIS). Current methods to determine vertebral rotation are either designed for use in analogue plain radiographs and not useful in digital images, or lack measurement precision and are therefore less suitable for the follow-up of rotation in AIS patients. This study aimed to develop a digital X-ray software tool with high measurement precision to determine vertebral rotation in AIS, and to assess its (concurrent) validity and reliability. In this study a combination of basic science and reliability methodology applied in both laboratory and clinical settings was used. Software was developed using the algorithm of the Perdriolle torsion meter for analogue AP plain radiographs of the spine. Software was then assessed for (1) concurrent validity and (2) intra- and interobserver reliability. Plain radiographs of both human cadaver vertebrae and outpatient AIS patients were used. Concurrent validity was measured by two independent observers, both experienced in the assessment of plain radiographs. Reliability-measurements were performed by three independent spine surgeons. Pearson correlation of the software compared with the analogue Perdriolle torsion meter for mid-thoracic vertebrae was 0.98, for low-thoracic vertebrae 0.97 and for lumbar vertebrae 0.97. Measurement exactness of the software was within 5° in 62% of cases and within 10° in 97% of cases. Intraclass correlation coefficient (ICC) for inter-observer reliability was 0.92 (0.91-0.95), ICC for intra-observer reliability was 0.96 (0.94-0.97). We developed a digital X-ray software tool to determine vertebral rotation in AIS with a substantial concurrent validity and reliability, which may be useful for the follow-up of vertebral rotation in AIS patients. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
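
    Since BDML is XML-based, the kind of record it holds is easy to picture. The Python snippet below emits a small document in that spirit using only the standard library; the element names are invented for illustration, and the schema file at the URL above defines the real structure.

        import xml.etree.ElementTree as ET

        # Invented element names, loosely in the spirit of an XML-based
        # dynamics format; consult the published BDML schema for the real one.
        bdml = ET.Element("bdml", version="0.2")
        obj = ET.SubElement(bdml, "object", id="cell-1", type="cell")
        for t, (x, y, z) in enumerate([(1.0, 2.0, 0.5), (1.1, 2.2, 0.5)]):
            m = ET.SubElement(obj, "measurement", time=str(t))
            ET.SubElement(m, "position", x=str(x), y=str(y), z=str(z))

        print(ET.tostring(bdml, encoding="unicode"))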

  3. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
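
    Since BDML files are plain XML, any standard XML library can walk them; the short Python sketch below is only an illustration of that point, and the element and attribute names in it are invented placeholders rather than the published BDML schema (take the real names from the specification linked above).

        # Minimal sketch of reading a BDML-style XML file with the standard
        # library. The element/attribute names ("object", "measurement",
        # t/x/y/z) are hypothetical placeholders, NOT the published BDML
        # schema; see http://ssbd.qbic.riken.jp/bdml/ for the real one.
        import xml.etree.ElementTree as ET

        def read_positions(path):
            """Yield (object_id, t, x, y, z) tuples from a BDML-like XML file."""
            root = ET.parse(path).getroot()
            for obj in root.iter("object"):          # hypothetical element name
                oid = obj.get("id", "unknown")
                for m in obj.iter("measurement"):    # hypothetical element name
                    yield (oid, *(float(m.get(k, "0")) for k in "txyz"))

        for row in read_positions("dynamics.bdml"):  # illustrative file name
            print(row)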

  4. Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere

    NASA Astrophysics Data System (ADS)

    Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.

    2010-12-01

    Radar altimetry is a technique whose range of applications keeps expanding. While considerable effort has been devoted to oceanography users (including easy-to-use data), using these data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, particularly for users new to altimetry data products. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels/in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it by summer 2010, with many newcomers to altimetry among them, including teachers and professors. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are under development, and some are in discussion for the future. Data use cases on cryosphere applications will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  5. STOP-IT: Windows executable software for the stop-signal paradigm.

    PubMed

    Verbruggen, Frederick; Logan, Gordon D; Stevens, Michaël A

    2008-05-01

    The stop-signal paradigm is a useful tool for the investigation of response inhibition. In this paradigm, subjects are instructed to respond as fast as possible to a stimulus unless a stop signal is presented after a variable delay. However, programming the stop-signal task is typically considered to be difficult. To overcome this issue, we present software called STOP-IT for running the stop-signal task, as well as an additional analysis program called ANALYZE-IT. The main advantage of both programs is that they are precompiled executables, so for basic use there is no need for additional programming. STOP-IT and ANALYZE-IT are completely based on free software, are distributed under the GNU General Public License, and are available at the personal Web sites of the first two authors or at expsy.ugent.be/tscope/stop.html.
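
    The analysis that ANALYZE-IT automates rests on the horse-race model of the stop-signal task. As a generic illustration (not the ANALYZE-IT source code), stop-signal reaction time can be estimated with the classic integration method: take the go-RT percentile equal to the probability of responding on stop trials and subtract the mean stop-signal delay.

        # Minimal sketch of SSRT estimation via the integration method
        # (horse-race model); illustrative only, not the ANALYZE-IT source.
        def ssrt_integration(go_rts, p_respond_given_stop, mean_ssd):
            """go_rts: go-trial RTs (ms); p_respond_given_stop: proportion of
            stop trials with a response; mean_ssd: mean stop-signal delay (ms)."""
            rts = sorted(go_rts)
            # nth fastest go RT, where n corresponds to p(respond | stop signal)
            n = int(round(p_respond_given_stop * len(rts)))
            n = min(max(n, 1), len(rts))   # clamp to a valid index
            return rts[n - 1] - mean_ssd

        # Invented example: 50% failed stops, mean SSD of 200 ms
        print(ssrt_integration([380, 420, 450, 480, 520], 0.5, 200))  # prints 220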

  6. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF - the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib) - and how their usage has improved the Concurrent Engineering (CE) process. This paper also reports the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of applying the KM tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into it. These tools will support the data model as a reference and documentation source, and provide access to simulation and calculation models. The use of KM tools aims to become a basic practice during the CE process in the CEF. Establishing this practice will result in a much more extensive exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality designs of space systems in the resulting studies.

  7. Presenting an evaluation model of the trauma registry software.

    PubMed

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma accounts for about 10% of deaths worldwide and is considered a global concern. This problem has driven healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries have an important and basic role in decreasing mortality and the disabilities caused by injuries resulting from trauma. Today, different software packages are designed for trauma registries. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. General and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites and related software in this domain. Based on the general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented to 12 experts and specialists using the Delphi technique. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The evaluation criteria fall into two groups: (1) general criteria and (2) specific criteria. General criteria of trauma registry software were classified into four main categories: (1) usability, (2) security, (3) maintainability, and (4) interoperability. Specific criteria were divided into four main categories: (1) data submission and entry, (2) reporting, (3) quality control, and (4) decision and research support. The model presented in this research introduces the important general and specific criteria of trauma registry software and the sub-criteria related to each main criterion separately. This model was validated by experts in the field and can therefore be used as a comprehensive model and a standard evaluation tool for measuring efficiency and effectiveness and for improving the performance of trauma registry software. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. These data need to be inspected for plausibility before data evaluation to detect putative sources of error, e.g. retention-time or mass-accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple quality control sample types as well as experimental samples in one or more measurement sequences.
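
    The kind of check QCScreen automates can be illustrated in a few lines. The Python sketch below, which is not QCScreen code, flags mass-accuracy and retention-time deviations for one target feature across a sequence of QC injections; the thresholds and measurements are invented.

        # Illustrative QC check for one target feature across a sequence of
        # QC injections: mass error in ppm and retention-time drift in
        # seconds. Not QCScreen code; all values here are invented examples.
        THEOR_MZ, REF_RT = 371.1012, 312.0       # theoretical m/z, reference RT (s)
        PPM_LIMIT, RT_LIMIT = 5.0, 10.0          # acceptance thresholds

        observations = [                         # (sample, observed m/z, RT in s)
            ("QC_01", 371.1010, 311.4),
            ("QC_02", 371.1031, 314.9),
            ("QC_03", 371.0992, 325.2),
        ]

        for sample, mz, rt in observations:
            ppm = (mz - THEOR_MZ) / THEOR_MZ * 1e6
            drift = rt - REF_RT
            ok = abs(ppm) <= PPM_LIMIT and abs(drift) <= RT_LIMIT
            print(f"{sample}: {ppm:+6.2f} ppm, {drift:+6.1f} s -> "
                  f"{'ok' if ok else 'CHECK'}")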

  9. MNE Scan: Software for real-time processing of electrophysiological data.

    PubMed

    Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph

    2018-06-01

    Magnetoencephalography (MEG) and Electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses, optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software package based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that takes into account the clinical software requirements expected by certification authorities. At the same time the software is extendable and freely accessible. We conclude that MNE Scan is the first step in creating a device-independent open-source software package to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Problem solving with genetic algorithms and Splicer

    NASA Technical Reports Server (NTRS)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
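
    To make the basic genetic algorithm concepts concrete, the following minimal Python sketch applies selection, crossover and mutation to the classic onemax toy problem (maximize the number of 1-bits in a bitstring). It is a generic textbook illustration, not Splicer's implementation.

        # Minimal generic genetic algorithm (onemax toy problem); not Splicer.
        import random

        GENES, POP, GENERATIONS, P_MUT = 32, 40, 60, 0.02
        fitness = lambda ind: sum(ind)               # count of 1-bits

        pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            nxt = []
            for _ in range(POP):
                # tournament selection of two parents
                p1 = max(random.sample(pop, 3), key=fitness)
                p2 = max(random.sample(pop, 3), key=fitness)
                cut = random.randrange(1, GENES)     # one-point crossover
                child = p1[:cut] + p2[cut:]
                # bit-flip mutation
                child = [b ^ 1 if random.random() < P_MUT else b for b in child]
                nxt.append(child)
            pop = nxt
        print("best fitness:", max(map(fitness, pop)), "of", GENES)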

  11. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).
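
    Several of the workshop topics, load balancing and domain decomposition in particular, reduce to a simple computation: giving each process a near-equal contiguous block of the global grid. A generic sketch of that rule (not drawn from any workshop paper):

        # Generic 1-D block domain decomposition: split N grid points over P
        # ranks so block sizes differ by at most one (basic load balancing).
        def block_bounds(n, nprocs, rank):
            """Return the half-open index range [lo, hi) owned by `rank`."""
            base, extra = divmod(n, nprocs)
            lo = rank * base + min(rank, extra)
            hi = lo + base + (1 if rank < extra else 0)
            return lo, hi

        for r in range(4):                     # e.g., 10 points over 4 ranks
            print(r, block_bounds(10, 4, r))   # -> (0,3) (3,6) (6,8) (8,10)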

  12. GeneFisher-P: variations of GeneFisher as processes in Bio-jETI

    PubMed Central

    Lamprecht, Anna-Lena; Margaria, Tiziana; Steffen, Bernhard; Sczyrba, Alexander; Hartmeier, Sven; Giegerich, Robert

    2008-01-01

    Background PCR primer design is an everyday, but not trivial task requiring state-of-the-art software. We describe the popular tool GeneFisher and explain its recent restructuring using workflow techniques. We apply a service-oriented approach to model and implement GeneFisher-P, a process-based version of the GeneFisher web application, as a part of the Bio-jETI platform for service modeling and execution. We show how to introduce a flexible process layer to meet the growing demand for improved user-friendliness and flexibility. Results Within Bio-jETI, we model the process using the jABC framework, a mature model-driven, service-oriented process definition platform. We encapsulate remote legacy tools and integrate web services using jETI, an extension of the jABC for seamless integration of remote resources as basic services, ready to be used in the process. Some of the basic services used by GeneFisher are in fact already provided as individual web services at BiBiServ and can be directly accessed. Others are legacy programs, and are made available to Bio-jETI via the jETI technology. The full power of service-based process orientation is required when more bioinformatics tools, available as web services or via jETI, lead to easy extensions or variations of the basic process. This concerns for instance variations of data retrieval or alignment tools as provided by the European Bioinformatics Institute (EBI). Conclusions The resulting service- and process-oriented GeneFisher-P demonstrates how basic services from heterogeneous sources can be easily orchestrated in the Bio-jETI platform and lead to a flexible family of specialized processes tailored to specific tasks. PMID:18460174

  13. GlycCompSoft: Software for Automated Comparison of Low Molecular Weight Heparins Using Top-Down LC/MS Data

    PubMed Central

    Li, Lingyun; Zhang, Fuming; Hu, Min; Ren, Fuji; Chi, Lianli; Linhardt, Robert J.

    2016-01-01

    Low molecular weight heparins are complex polycomponent drugs that have recently become amenable to top-down analysis using liquid chromatography-mass spectrometry. Even using the open source deconvolution software DeconTools and the automatic structural assignment software GlycReSoft, the comparison of two or more low molecular weight heparins is extremely time-consuming, taking about a week for an expert analyst, and provides no guarantee of accuracy. Efficient data processing tools are required to improve analysis. This study uses Microsoft Excel™ Visual Basic for Applications to extend Excel's standard functionality with macro functions and specific mathematical modules for mass spectrometric data processing. The program developed enables the comparison of top-down analytical glycomics data on two or more low molecular weight heparins. The current study describes this new program, GlycCompSoft, which has a low error rate and good time efficiency in the automatic processing of large data sets. The experimental results, based on three lots of Lovenox®, Clexane® and three generic enoxaparin samples, show that the run time of GlycCompSoft decreases from 11 to 2 seconds when the data processed decreases from 18000 to 1500 rows. PMID:27942011
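
    The core comparison GlycCompSoft automates, matching deconvoluted masses between samples, can be sketched generically. The Python code below is an illustration only (the tool itself is an Excel VBA program), and the tolerance and masses are invented.

        # Generic sketch: match deconvoluted neutral masses between two
        # samples within a ppm tolerance. Illustration only; GlycCompSoft
        # itself is an Excel VBA program with its own matching logic.
        TOL_PPM = 10.0

        def match_masses(sample_a, sample_b, tol_ppm=TOL_PPM):
            """Return (mass_a, mass_b) pairs agreeing within tol_ppm."""
            pairs = []
            for ma in sample_a:
                best = min(sample_b, key=lambda mb: abs(mb - ma))
                if abs(best - ma) / ma * 1e6 <= tol_ppm:
                    pairs.append((ma, best))
            return pairs

        a = [1154.313, 1736.482, 2318.651]     # invented example masses (Da)
        b = [1154.321, 2318.630, 2900.820]
        print(match_masses(a, b))              # pairs the two shared species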

  14. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    PubMed Central

    Dowd, Scot E; Zaragoza, Joaquin; Rodriguez, Javier R; Oliver, Melvin J; Payton, Paxton R

    2005-01-01

    Background BLAST is one of the most common and useful tools for genetic research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LAN). W.ND-BLAST provides intuitive Graphic User Interfaces (GUI) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. The software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high-throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high-throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is freely downloadable from . With registration the software is free; installation, networking, and usage instructions are provided, as well as a support forum. PMID:15819992
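
    The distribution strategy described above, partitioning query sequences across machines, can be illustrated generically. The sketch below splits a multi-FASTA query file into per-node chunks; it is plain Python and unrelated to W.ND-BLAST's actual internals.

        # Generic sketch: split a multi-FASTA query file into N chunks, one
        # per compute node, for distributed BLAST. Not W.ND-BLAST code.
        def split_fasta(path, n_nodes):
            """Round-robin FASTA records into n_nodes chunk files."""
            records, current = [], []
            with open(path) as fh:
                for line in fh:
                    if line.startswith(">") and current:
                        records.append("".join(current))
                        current = []
                    current.append(line)
            if current:
                records.append("".join(current))
            for i in range(n_nodes):
                with open(f"{path}.chunk{i}.fasta", "w") as out:
                    out.writelines(records[i::n_nodes])  # every n-th record

        split_fasta("queries.fasta", 17)  # e.g., the 17-node case above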

  15. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  16. CONRAD—A software framework for cone-beam imaging in radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Andreas; Choi, Jang-Hwan; Riess, Christian

    2013-11-15

    Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code, of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.
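
    Of the reconstruction algorithms CONRAD includes, FBP is compact enough to sketch. Below is a minimal parallel-beam FBP in Python/numpy for orientation only; CONRAD itself is a Java framework and implements the full cone-beam variants with consideration of physical effects.

        # Minimal parallel-beam filtered backprojection (FBP) in numpy, for
        # orientation only; not CONRAD code (CONRAD is Java and cone-beam).
        import numpy as np

        def fbp(sinogram, angles_deg):
            """sinogram: (n_angles, n_det) array; returns (n_det, n_det) image."""
            n_angles, n_det = sinogram.shape
            # 1) Ramp-filter every projection row in the Fourier domain
            ramp = np.abs(np.fft.fftfreq(n_det))
            filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                           axis=1))
            # 2) Smear (backproject) each filtered projection across the image
            image = np.zeros((n_det, n_det))
            xs = np.arange(n_det) - n_det / 2.0
            X, Y = np.meshgrid(xs, xs)
            for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
                t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
                image += np.interp(t.ravel(), np.arange(n_det), proj,
                                   left=0.0, right=0.0).reshape(n_det, n_det)
            return image * np.pi / n_angles

        sino = np.zeros((180, 64)); sino[:, 32] = 1.0  # toy: point at center
        print(fbp(sino, np.arange(180)).shape)         # (64, 64), central peak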

  17. An Implementation Methodology and Software Tool for an Entropy Based Engineering Model for Evolving Systems

    DTIC Science & Technology

    2003-06-01

    [Fragmented text extract. The recoverable content describes data access via relational databases (RDBMS) and Structured Query Language (SQL), illustrated by the example query "What were unit sales in New England last March?", and macros written in Visual Basic for Applications (VBA). It also references an iteration-two class diagram (Figure 20) in which a Tech OASIS export script sends data to an import filter and MS Excel, which contains and executes a VBA macro.]

  18. Installation Mapping Enables Many Missions: The Benefits of and Barriers to Sharing Geospatial Data Assets

    DTIC Science & Technology

    2007-01-01

    software applications and rely on the installations to supply them with the basic I&E geospatial data sets for those applications. Such...spatial data in geospatially based tools to help track military supplies and materials all over the world. For instance, SDDCTEA developed IRRIS, a...regional offices or individual installations to supply the data and perform QA/QC in the process. The IVT program office worked with the installations and

  19. Towards open-source, low-cost haptics for surgery simulation.

    PubMed

    Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie

    2014-01-01

    In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We demonstrate the basic mechanical design of the device, the sensor setup as well as its software integration.

  20. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of their behavior is carried out by means of computational models whose basic mathematical models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively exploiting the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  1. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    PubMed

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS over the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, the relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals into clinical effects and to guide therapeutic choices. We have developed software at Poitiers University Hospital allowing real-time objective mapping, on a touch-screen interface, of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself. The purpose of this article is to describe this intraoperative mapping software in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software is dedicated to patients with failed back surgery syndrome who are candidates for SCS lead implantation, enabling them to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by a percutaneous or multicolumn surgical SCS lead implanted under awake anaesthesia allows intraoperative lead programming, and possibly lead positioning, to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of the NML software should not be limited to optimizing the implantation of one specific device in a patient, but should also allow instantaneous comparison of various stimulation strategies by characterizing new technical parameters such as "coverage efficacy" and "device specificity" on selected subgroups of patients. Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming of tomorrow's neurostimulators. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  2. Evaluation of tools for highly variable gene discovery from single-cell RNA-seq data.

    PubMed

    Yip, Shun H; Sham, Pak Chung; Wang, Junwen

    2018-02-21

    Traditional RNA sequencing (RNA-seq) allows the detection of gene expression variations between two or more cell populations through differentially expressed gene (DEG) analysis. However, genes that contribute to cell-to-cell differences are not discoverable with RNA-seq because RNA-seq samples are obtained from a mixture of cells. Single-cell RNA-seq (scRNA-seq) allows the detection of gene expression in each cell. With scRNA-seq, highly variable gene (HVG) discovery allows the detection of genes that contribute strongly to cell-to-cell variation within a homogeneous cell population, such as a population of embryonic stem cells. This analysis is implemented in many software packages. In this study, we compare seven HVG methods from six software packages, including BASiCS, Brennecke, scLVM, scran, scVEGs and Seurat. Our results demonstrate that reproducibility in HVG analysis requires a larger sample size than DEG analysis. Discrepancies between methods and potential issues in these tools are discussed and recommendations are made.
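
    The computation underlying most HVG methods can be sketched simply: rank genes by how much their dispersion exceeds that of genes with similar mean expression. The following generic Python illustration is not the algorithm of any specific package compared in the study.

        # Generic sketch of HVG selection: rank genes by dispersion
        # (variance/mean) normalized against genes of similar mean
        # expression. Illustrative only; not the internals of BASiCS,
        # Brennecke, scLVM, scran, scVEGs or Seurat.
        import numpy as np

        def top_hvgs(counts, gene_names, n_top=5, n_bins=10):
            """counts: (cells, genes) array of normalized counts."""
            log = np.log1p(counts)
            mean, var = log.mean(axis=0), log.var(axis=0)
            dispersion = var / np.maximum(mean, 1e-12)
            # z-score the dispersion within bins of similar mean expression
            edges = np.quantile(mean, np.linspace(0.0, 1.0, n_bins))
            bins = np.digitize(mean, edges)
            z = np.zeros_like(dispersion)
            for b in np.unique(bins):
                idx = bins == b
                spread = dispersion[idx].std() or 1.0
                z[idx] = (dispersion[idx] - dispersion[idx].mean()) / spread
            return [gene_names[i] for i in np.argsort(z)[::-1][:n_top]]

        rng = np.random.default_rng(0)
        toy = rng.poisson(5.0, size=(100, 50)).astype(float)  # cells x genes
        print(top_hvgs(toy, [f"gene{i}" for i in range(50)]))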

  3. Comparison of software packages for detecting differential expression in RNA-seq studies

    PubMed Central

    Seyednasrollah, Fatemeh; Laiho, Asta

    2015-01-01

    RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. PMID:24300110

  4. Comparison of software packages for detecting differential expression in RNA-seq studies.

    PubMed

    Seyednasrollah, Fatemeh; Laiho, Asta; Elo, Laura L

    2015-01-01

    RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. © The Author 2013. Published by Oxford University Press.

  5. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development, while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  6. Analyzing Saturn's Magnetospheric Data After Cassini - Improving and Future-Proofing Cassini / MAPS Tools and Data

    NASA Astrophysics Data System (ADS)

    Brown, L. E.; Faden, J.; Vandegriff, J. D.; Kurth, W. S.; Mitchell, D. G.

    2017-12-01

    We present a plan to provide enhanced longevity to the analysis software and science data used throughout the Cassini mission for viewing Magnetosphere and Plasma Science (MAPS) data. While a final archive is being prepared for Cassini, the tools that read from this archive will eventually become moribund as real-world hardware and software systems evolve. We will add an access layer over existing and planned Cassini data products that will allow multiple tools to access many public MAPS datasets. The access layer is called the Heliophysics Application Programmer's Interface (HAPI), and this is a mechanism being adopted at many data centers across heliophysics and planetary science for serving time series data. Two existing tools are also being enhanced to read from HAPI servers, namely Autoplot from the University of Iowa and MIDL (Mission Independent Data Layer) from the Johns Hopkins Applied Physics Lab. Thus both tools will be able to access data from RPWS, MAG, CAPS, and MIMI. In addition to being able to access data from each other's institutions, these tools will be able to read from all the new datasets expected to come online using the HAPI standard in the near future. The PDS also plans to use HAPI for all the holdings at the Planetary Plasma Interactions (PPI) node. The new HAPI data server mechanism is presented, along with an early demonstration of the modified tools.
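
    The access pattern is uniform across HAPI servers: a catalog, an info endpoint, and a data endpoint queried by dataset id and time range. The sketch below shows such a request in Python; the server URL and dataset id are hypothetical, and the parameter names (id, time.min, time.max, format) follow the HAPI 2.x specification as best recalled here and should be checked against the current spec.

        # Sketch of a client request to a HAPI-compliant server using only
        # the standard library. Server URL and dataset id are hypothetical;
        # verify parameter names against the current HAPI specification.
        import urllib.parse
        import urllib.request

        SERVER = "https://example.org/hapi"          # hypothetical endpoint
        query = urllib.parse.urlencode({
            "id": "RPWS_KEY_PARAMS",                 # hypothetical dataset id
            "time.min": "2017-01-01T00:00:00Z",
            "time.max": "2017-01-02T00:00:00Z",
            "format": "csv",
        })
        with urllib.request.urlopen(f"{SERVER}/data?{query}") as resp:
            for line in resp.read().decode().splitlines()[:5]:
                print(line)                          # first few CSV records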

  7. The Azimuth Project: an Open-Access Educational Resource

    NASA Astrophysics Data System (ADS)

    Baez, J. C.

    2012-12-01

    The Azimuth Project is an online collaboration of scientists, engineers and programmers who are volunteering their time to do something about a wide range of environmental problems. The project has several aspects: 1) a wiki designed to make reliable, sourced information easy to find and accessible to technically literate nonexperts, 2) a blog featuring expository articles and news items, 3) a project to write programs that explain basic concepts of climate physics and illustrate principles of good open-source software design, and 4) a project to develop mathematical tools for studying complex networked systems. We discuss the progress so far and some preliminary lessons. For example, enlisting the help of experts outside academia highlights the problems with pay-walled journals and the benefits of open access, as well as differences between how software development is done commercially, in the free software community, and in academe.

  8. Can a customer relationship management program improve recruitment for primary care research studies?

    PubMed

    Johnston, Sharon; Wong, Sabrina T; Blackman, Stephanie; Chau, Leena W; Grool, Anne M; Hogg, William

    2017-11-16

    Recruiting family physicians into primary care research studies requires researchers to continually manage information coming in, going out, and coming in again. In many research groups, Microsoft Excel and Access are the usual data management tools, but they are very basic and do not support any automation, linking, or reminder systems to manage and integrate recruitment information and processes. We explored whether a commercial customer relationship management (CRM) software program - designed for sales people in businesses to improve customer relations and communications - could be used to make the research recruitment system faster, more effective, and more efficient. We found that while there was potential for long-term studies, it simply did not adapt effectively enough for our shorter study and recruitment budget. The amount of training required to master the software and our need for ongoing flexible and timely support were greater than the benefit of using CRM software for our study.

  9. The analysis of the accuracy of spatial models using photogrammetric software: Agisoft Photoscan and Pix4D

    NASA Astrophysics Data System (ADS)

    Barbasiewicz, Adrianna; Widerski, Tadeusz; Daliga, Karol

    2018-01-01

    This article was created as a result of research conducted within a master's thesis. The purpose of the measurements was to analyze the accuracy of point positioning by computer programs. The selected software was specialized software dedicated to photogrammetric work; for comparative purposes, tools with similar functionality were chosen. The basic parameter examined for its effect on the results was the resolution of the photos in which the key points were searched. The locations of the determined points were obtained following the photogrammetric resection principle. To automate the measurement, measurement-session planning was omitted. The coordinates of points collected by tachymetric measurement were used as the reference system. The resulting deviations and linear displacements are on the order of millimeters. The visual aspects of the point clouds were also briefly analyzed.

  10. Project Management Software for Distributed Industrial Companies

    NASA Astrophysics Data System (ADS)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly for use in an industrial environment. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed, industrial companies. With this in mind, special care has been devoted to the development of appropriate tools for tracking, storing and analyzing information about the project, and for delivering it in time to the right team members or other responsible persons. The proposed solution is Internet-based and uses the LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open source technology and simple maintenance. The modular structure of the software makes it easy to customize according to client-specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, short training and the need for only basic computer skills on the part of operators.

  11. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    NASA Astrophysics Data System (ADS)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer package has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed at IMGW. The first example is an on-line oceanographic service, which presents forecasts of the water temperature, sea level and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype Internet-based data brokerage system, which would enable users to search, extract, buy and download datasets containing meteorological or oceanographic data. During the project IMGW developed an online application, called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above-mentioned applications were developed in Visual Basic. Adopting Surfer for the monitoring service, which provides access to the data collected in the monitoring of the Baltic Sea environment, is currently under consideration.
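
    Because the Automation objects are accessible from any Automation-enabled environment, they can also be driven from Python rather than Visual Basic. The sketch below assumes the ProgID and GridData call found in Golden Software's automation examples; both should be verified against the installed Surfer version.

        # Sketch of driving Surfer's ActiveX Automation objects from Python
        # via pywin32 rather than Visual Basic. The ProgID
        # "Surfer.Application" and the GridData call follow Golden
        # Software's published automation examples, but treat both as
        # assumptions to verify for your Surfer version; paths illustrative.
        import win32com.client   # requires pywin32, Windows, Surfer installed

        app = win32com.client.Dispatch("Surfer.Application")
        app.Visible = True
        # Grid scattered observations into a regular grid, as a daily
        # update script for the oceanographic service might do:
        app.GridData(DataFile=r"C:\data\sea_level.dat",
                     OutGrid=r"C:\data\sea_level.grd")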

  12. A qualitative study on personal information management (PIM) in clinical and basic sciences faculty members of a medical university in Iran

    PubMed Central

    Sedghi, Shahram; Abdolahi, Nida; Azimi, Ali; Tahamtan, Iman; Abdollahi, Leila

    2015-01-01

    Background: Personal Information Management (PIM) refers to the tools and activities used to save and retrieve personal information for future use. This study examined the PIM activities of faculty members of Iran University of Medical Sciences (IUMS) regarding their preferred PIM tools and four aspects of acquiring, organizing, storing and retrieving personal information. Methods: The qualitative design was based on a phenomenology approach, and we carried out 37 interviews with clinical and basic sciences faculty members of IUMS in 2014. The participants were selected using a random sampling method. All interviews were recorded with a digital voice recorder, then transcribed, coded and finally analyzed using NVivo 8 software. Results: The use of PIM electronic tools (e-tools) was below expectations among the studied sample, and just 37% had reasonable knowledge of PIM e-tools such as external hard drives, flash memories, etc. However, all participants used both paper and electronic devices to store and access information. Internal mass memories (in laptops) and flash memories were the e-tools most used to save information. Most participants used "subject" (41.0%) and "file name" (33.7%) to save, organize and retrieve their stored information. Most users preferred paper-based rather than electronic tools to keep their personal information. Conclusion: Faculty members had little knowledge of PIM techniques and tools. Those who organized personal information could more easily retrieve the stored information for future use. Greater familiarity with PIM tools and training courses on PIM tools and techniques are suggested. PMID:26793648

  13. Agreement Between Face-to-Face and Free Software Video Analysis for Assessing Hamstring Flexibility in Adolescents.

    PubMed

    Moral-Muñoz, José A; Esteban-Moreno, Bernabé; Arroyo-Morales, Manuel; Cobo, Manuel J; Herrera-Viedma, Enrique

    2015-09-01

    The objective of this study was to determine the level of agreement between face-to-face hamstring flexibility measurements and free software video analysis in adolescents. Reduced hamstring flexibility is common in adolescents (75% of boys and 35% of girls aged 10). The length of the hamstring muscle has an important role in both the effectiveness and the efficiency of basic human movements, and reduced hamstring flexibility is related to various musculoskeletal conditions. There are various approaches to measuring hamstring flexibility with high reliability; the most commonly used approaches in the scientific literature are the sit-and-reach test, hip joint angle (HJA), and active knee extension. The assessment of hamstring flexibility using video analysis could help with adolescent flexibility follow-up. Fifty-four adolescents from a local school participated in a descriptive study of repeated measures using a crossover design. Active knee extension and HJA were measured with an inclinometer and were simultaneously recorded with a video camera. Each video was downloaded to a computer and subsequently analyzed using Kinovea 0.8.15, a free software application for movement analysis. All outcome measures showed reliability estimates with α > 0.90. The lowest reliability was obtained for HJA (α = 0.91). The preliminary findings support the use of a free software tool for assessing hamstring flexibility, offering health professionals a useful tool for adolescent flexibility follow-up.

  14. Open source software in a practical approach for post processing of radiologic images.

    PubMed

    Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea

    2015-03-01

    The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs, including DICOM viewers and various tools (converters, DICOM header editors, etc.). The selected programs all meet basic requirements such as free availability, stand-alone operation, presence of a graphical user interface, ease of installation and advanced features beyond simple image display. The data import, data export, metadata, 2D viewer, 3D viewer, supported platform and usability capabilities of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to 8. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView and VR Render, while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing in medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
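
    Beyond stand-alone viewers, DICOM files can also be processed programmatically with open source libraries. As a minimal illustration (using pydicom, which is not one of the reviewed programs), reading a file and inspecting its header and pixel data takes a few lines:

        # Minimal sketch of programmatic DICOM access with the open-source
        # pydicom library: read a file and inspect a few header fields and
        # the pixel data. The file path is illustrative.
        import pydicom

        ds = pydicom.dcmread("image.dcm")
        print(ds.Modality, ds.Rows, ds.Columns)   # standard DICOM attributes
        pixels = ds.pixel_array                   # numpy array of the image
        print(pixels.shape, pixels.dtype)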

  15. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    PubMed

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
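
    The modified Beer-Lambert law at the heart of the reconstruction step can be written as a small linear system: optical-density changes at two wavelengths are mapped to oxy- and deoxyhemoglobin changes through extinction coefficients, the source-detector distance and differential pathlength factors. A sketch follows, with placeholder coefficient values; it is not ICNNA code.

        # Sketch of the modified Beer-Lambert law (MBLL) conversion: solve a
        # 2x2 system per sample to turn optical-density changes at two
        # wavelengths into hemoglobin concentration changes. Coefficient
        # and DPF values below are placeholders only, not real constants.
        import numpy as np

        def mbll(delta_od, eps, distance, dpf):
            """delta_od: (2,) OD changes at the two wavelengths; eps: (2, 2)
            extinction coefficients [[HbO2, HbR] per wavelength]; distance:
            source-detector separation; dpf: (2,) differential pathlength
            factors. Returns (delta_HbO2, delta_HbR)."""
            A = np.asarray(eps) * (distance * np.asarray(dpf))[:, None]
            return np.linalg.solve(A, np.asarray(delta_od))

        eps = [[1.4, 3.8],        # placeholder coefficients, wavelength 1
               [2.8, 1.8]]        # placeholder coefficients, wavelength 2
        print(mbll([0.012, 0.010], eps, distance=3.0, dpf=[6.0, 6.0]))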

  16. ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.

    PubMed

    Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus

    2011-12-01

    The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
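
    The time-domain HRV statistics mentioned above follow standard formulas, sketched below for an artifact-corrected IBI series (a generic illustration, not the ARTiiFACT source):

        # Sketch of basic time-domain HRV statistics computed from an
        # artifact-corrected interbeat-interval (IBI) series, in ms.
        # Standard formulas; not ARTiiFACT code.
        import math

        def sdnn(ibis):
            """Standard deviation of all normal-to-normal intervals."""
            m = sum(ibis) / len(ibis)
            return math.sqrt(sum((x - m) ** 2 for x in ibis) / (len(ibis) - 1))

        def rmssd(ibis):
            """Root mean square of successive IBI differences."""
            diffs = [b - a for a, b in zip(ibis, ibis[1:])]
            return math.sqrt(sum(d * d for d in diffs) / len(diffs))

        ibis = [812, 790, 845, 821, 803, 835]   # invented example values
        print(f"SDNN = {sdnn(ibis):.1f} ms, RMSSD = {rmssd(ibis):.1f} ms")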

  17. FILTSoft: A computational tool for microstrip planar filter design

    NASA Astrophysics Data System (ADS)

    Elsayed, M. H.; Abidin, Z. Z.; Dahlan, S. H.; Cholan N., A.; Ngu, Xavier T. I.; Majid, H. A.

    2017-09-01

    Filters are key components of any communication system, controlling spectrum use and suppressing interference. Designing a filter involves a long process as well as a good understanding of the underlying hardware technology. Hence this paper introduces an automated design tool based on a Matlab GUI, called FILTSoft (an acronym for Filter Design Software), to ease the process. FILTSoft is a user-friendly filter design tool that aids, guides and expedites calculations from the lumped-element level to the microstrip structure. Users only have to provide the required filter specifications as well as the material description. FILTSoft calculates and displays the lumped-element details, the planar filter structure, and the filter's expected response. An example lowpass filter design was calculated using FILTSoft and the results were validated through prototype measurement for comparison purposes.
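
    The lumped-element stage of such a design follows standard textbook formulas: Butterworth prototype g-values, scaled to the cutoff frequency and system impedance. A generic sketch of that calculation (not FILTSoft's source code):

        # Standard textbook lowpass design: Butterworth prototype g-values
        # scaled to a cutoff frequency and system impedance. Generic
        # illustration of the lumped-element stage; not FILTSoft code.
        import math

        def butterworth_lowpass(order, f_cutoff_hz, z0=50.0):
            """Return alternating (element, value) pairs: series L (H),
            shunt C (F), for one common ladder topology choice."""
            wc = 2 * math.pi * f_cutoff_hz
            elements = []
            for k in range(1, order + 1):
                g = 2 * math.sin((2 * k - 1) * math.pi / (2 * order))
                if k % 2 == 1:                   # odd position: series inductor
                    elements.append(("L", g * z0 / wc))
                else:                            # even position: shunt capacitor
                    elements.append(("C", g / (z0 * wc)))
            return elements

        for name, value in butterworth_lowpass(order=3, f_cutoff_hz=2.4e9):
            print(f"{name} = {value:.3e}")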

  18. Automated delineation and characterization of watersheds for more than 3,000 surface-water-quality monitoring stations active in 2010 in Texas

    USGS Publications Warehouse

    Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.

    2012-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.

  19. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and its growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and interoperability. Overall, most of the applications are able to perform the basic tasks of model creation and FBA simulation. The COBRA toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms for identifying environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux have flexible environments, as they enable plug-ins/add-ons to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance for collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, the most recent applications such as Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications is provided for the benefit of potential tool developers.
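
    The optimization every FBA tool solves is a linear program: maximize an objective flux subject to steady state (S·v = 0) and flux bounds. A toy illustration with scipy, not taken from any reviewed tool:

        # Minimal flux balance analysis (FBA) sketch on a toy 3-reaction
        # network: maximize biomass flux subject to steady state S.v = 0
        # and flux bounds. Generic illustration; toy numbers throughout.
        import numpy as np
        from scipy.optimize import linprog

        # Metabolite-by-reaction stoichiometric matrix for:
        #   R1: -> A        R2: A -> B        R3 (biomass): B ->
        S = np.array([[ 1, -1,  0],    # metabolite A
                      [ 0,  1, -1]])   # metabolite B
        c = np.array([0, 0, 1])        # objective: flux through R3
        bounds = [(0, 10), (0, 8), (0, None)]   # uptake <= 10, R2 <= 8

        # linprog minimizes, so negate c to maximize the biomass flux
        res = linprog(-c, A_eq=S, b_eq=np.zeros(2), bounds=bounds,
                      method="highs")
        print("optimal fluxes:", res.x)  # expect [8, 8, 8]: R2 is limiting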

  20. Geoinformatic subsystem for real estate market analysis (Polish title: Podsystem geoinformatyczny do analizy rynku nieruchomosci)

    NASA Astrophysics Data System (ADS)

    Basista, A.

    2013-12-01

    There are many tools for managing spatial data, known as Geographic Information Systems (GIS), which, apart from visualizing data in space, let users perform various spatial analyses. Thanks to them, it is possible to obtain additional, essential information for real estate market analysis. Much scientific research presents the use of GIS for mass valuation, because advanced tools are necessary to manage the huge sets of real estate data gathered for mass valuation needs. In practice, appraisers rarely use these tools for single valuations, because few GIS tools are available to support real estate valuation. The paper presents the functionality of a geoinformatic subsystem that supports real estate market analysis and real estate valuation. A detailed description is given of the process of entering attributes into the database and of calculating attribute values based on the proposed definition of attribute scales. This work also presents the algorithm for selecting similar properties that was implemented within the described subsystem. The main stage of this algorithm is the calculation of a price-creative indicator for each real estate, using its attribute values. The set of properties chosen in this way is visualized on the map. The geoinformatic subsystem is used for undeveloped real estate and residential premises. Geographic Information System software was used to develop this project: the basic functionality of gvSIG (open source software) was extended and extra functions were added to support real estate market analysis.
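
    The abstract does not spell out the price-creative indicator's formula, so the following Python sketch is a hypothetical illustration only: a weighted sum of normalized attribute values scores each property, and the candidates whose scores lie closest to the subject property's score are selected. All weights and values are invented.

    ```python
    import numpy as np

    def price_creative_indicator(attributes, weights):
        # stand-in for the paper's indicator: weighted sum of attribute values
        return attributes @ weights

    def select_similar(subject, candidates, weights, k=5):
        s = price_creative_indicator(subject, weights)
        scores = price_creative_indicator(candidates, weights)
        order = np.argsort(np.abs(scores - s))
        return order[:k]                      # indices of the k most similar properties

    weights = np.array([0.4, 0.3, 0.2, 0.1])  # assumed attribute importance
    subject = np.array([0.8, 0.5, 0.2, 1.0])  # attribute values scaled to [0, 1]
    candidates = np.random.default_rng(0).random((50, 4))
    print(select_similar(subject, candidates, weights))
    ```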

  1. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
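
    As a concrete illustration of the importance-sampling idea mentioned above, this Python sketch estimates a rare failure probability P(X > t) for a standard normal X by sampling from a proposal shifted onto the failure region and reweighting by the likelihood ratio; with plain Monte Carlo, almost no samples would land in the region of interest. The example distribution and threshold are illustrative choices.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    t, n = 4.0, 100_000

    x = rng.normal(loc=t, scale=1.0, size=n)          # proposal N(t, 1)
    weights = np.exp(t**2 / 2 - t * x)                # phi(x) / phi(x - t)
    estimate = np.mean((x > t) * weights)             # unbiased estimate of P(X > t)

    print(f"importance sampling: {estimate:.3e}")
    print(f"exact 1 - Phi(t):    {norm.sf(t):.3e}")   # ~3.17e-05
    ```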

  2. Remote Sensing and Capacity Building to Improve Food Security

    NASA Astrophysics Data System (ADS)

    Husak, G. J.; Funk, C. C.; Verdin, J. P.; Rowland, J.; Budde, M. E.

    2012-12-01

    The Famine Early Warning Systems Network (FEWS NET) is a U.S. Agency for International Development (USAID) supported project designed to monitor and anticipate food insecurity in the developing world, primarily Africa, Central America, the Caribbean and Central Asia. This is done through a network of partners involving U.S. government agencies, universities, country representatives, and partner institutions. This presentation will focus on the remotely sensed data used in FEWS NET activities and capacity building efforts designed to expand and enhance the use of FEWS NET tools and techniques. Remotely sensed data are of particular value in the developing world, where ground data networks and data reporting are limited. FEWS NET uses satellite based rainfall and vegetation greenness measures to monitor and assess food production conditions. Satellite rainfall estimates also drive crop models which are used in determining yield potential. Recent FEWS NET products also include estimates of actual evapotranspiration. Efforts are currently underway to assimilate these products into a single tool which would indicate areas experiencing abnormal conditions with implications for food production. FEWS NET is also involved in a number of capacity building activities. Two primary examples are the development of software and training of institutional partners in basic GIS and remote sensing. Software designed to incorporate rainfall station data with existing satellite-derived rainfall estimates gives users the ability to enhance satellite rainfall estimates or long-term means, resulting in gridded fields of rainfall that better reflect ground conditions. Further, this software includes a crop water balance model driven by the improved rainfall estimates. Finally, crop parameters, such as the planting date or length of growing period, can be adjusted by users to tailor the crop model to actual conditions. Training workshops in the use of this software, as well as basic GIS and remote sensing tools, are routinely conducted by FEWS NET representatives at host country meteorological and agricultural services. These institutions are then able to produce information that can more accurately inform food security decision making. Informed decision making reduces the risk associated with a given hazard. In the case of FEWS NET, this involves identification of shocks to food availability, allowing for the pre-positioning of aid to be available when a hazard strikes. Developing tools to incorporate better information in food production estimates and working closely with local staff trained in state-of-the-practice techniques results in a more informed decision making process, reducing the impacts of food security hazards.
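
    The FEWS NET software's internals are not given in the abstract; the sketch below is only a hypothetical illustration of the blending idea it describes, spreading gauge-minus-satellite differences across the grid with inverse-distance weights. All names, units and the interpolation scheme are assumptions.

    ```python
    import numpy as np

    def blend(sat_grid, grid_lon, grid_lat, stn_lon, stn_lat, stn_rain, power=2.0):
        """Adjust a satellite rainfall grid toward station observations (IDW)."""
        glon, glat = np.meshgrid(grid_lon, grid_lat)
        correction = np.zeros_like(sat_grid)
        weight_sum = np.zeros_like(sat_grid)
        for lon, lat, rain in zip(stn_lon, stn_lat, stn_rain):
            i = np.abs(grid_lat - lat).argmin()        # nearest cell to the station
            j = np.abs(grid_lon - lon).argmin()
            diff = rain - sat_grid[i, j]               # gauge minus satellite
            d = np.hypot(glon - lon, glat - lat) + 1e-6
            w = d ** -power
            correction += w * diff
            weight_sum += w
        return np.clip(sat_grid + correction / weight_sum, 0.0, None)

    lon = np.linspace(33.0, 34.0, 11); lat = np.linspace(-1.0, 0.0, 11)
    sat = np.full((11, 11), 5.0)                       # mm/day, flat field
    blended = blend(sat, lon, lat, [33.5], [-0.5], [8.0])
    print(blended[5, 5])                               # pulled toward the gauge
    ```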

  3. 3DRT-MPASS

    NASA Technical Reports Server (NTRS)

    Lickly, Ben

    2005-01-01

    Data from all current JPL missions are stored in files called SPICE kernels. At present, animators who want to use data from these kernels have to either read through the kernels looking for the desired data, or write programs themselves to retrieve information about all the needed objects for their animations. In this project, methods of automating the process of importing the data from the SPICE kernels were researched. In particular, tools were developed for creating basic scenes in Maya, a 3D computer graphics software package, from SPICE kernels.
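
    The project's Maya-side code is not shown in the record. As a sketch of the automated import step, the following fragment uses the spiceypy bindings to the SPICE toolkit to turn kernel data into animation keyframes; the meta-kernel path is a hypothetical placeholder and must list a leapseconds kernel for the time conversion to work.

    ```python
    import spiceypy as spice

    spice.furnsh("mission_kernels.tm")                 # load kernels via a meta-kernel

    et0 = spice.str2et("2005 JUN 01 00:00:00")         # UTC -> ephemeris time
    keyframes = []
    for k in range(100):
        et = et0 + 60.0 * k                            # one keyframe per minute
        pos, _lt = spice.spkpos("MARS", et, "J2000", "NONE", "EARTH")
        keyframes.append(pos)                          # km, J2000 frame

    spice.kclear()
    ```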

  4. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling them: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models, and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
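
    The paper's toolbox and record formats are not specified in the abstract, so the following is a hypothetical Python sketch of the interdependence it argues for: construction operations come only from the toolbox, and each application automatically appends an entry to the development record that later tools could replay or inspect.

    ```python
    import json, time

    class ModelDevelopmentRecord:
        def __init__(self):
            self.entries = []

        def log(self, op_name, **params):
            self.entries.append({"op": op_name, "params": params, "t": time.time()})

        def dump(self):
            return json.dumps(self.entries, indent=2)

    class ModelToolbox:
        def __init__(self, record):
            self.record = record
            self.model = {}

        def add_equation(self, name, expr):            # one well-defined operation
            self.model[name] = expr
            self.record.log("add_equation", name=name, expr=expr)

    record = ModelDevelopmentRecord()
    toolbox = ModelToolbox(record)
    toolbox.add_equation("momentum", "du/dt = -grad(p)/rho")
    print(record.dump())                               # the automatically built record
    ```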

  5. EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan

    2016-09-01

    The motion control, data acquisition and analysis system for the APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS was designed as a framework with software tools and applications that provide a software infrastructure for building distributed control systems to operate devices such as particle accelerators, large experiments and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single-purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high-precision X-ray mirror measurement. Those built-in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera) and temperature sensors. Scanning the mirror and taking measurements were accomplished with an EPICS feature (the sscan record) which synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were automatically configured using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
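
    A minimal sketch of the move-then-measure pattern that the sscan record automates, written against the pyepics Channel Access bindings rather than the facility's actual configuration; the PV names are hypothetical placeholders.

    ```python
    import epics

    positions = [0.1 * i for i in range(50)]           # mm along the mirror
    slopes = []
    for x in positions:
        epics.caput("SMP:stage:x.VAL", x, wait=True)   # move, block until done
        slopes.append(epics.caget("SMP:autocollimator:angle"))

    print(list(zip(positions, slopes)))
    ```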

  6. Value Addition to Cartosat-I Imagery

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2014-11-01

    In the sector of remote sensing applications, the use of stereo data is on a steady rise. An attempt is hereby made to develop a software suite specifically for the exploitation of Cartosat-I data. A few algorithms to enhance the quality of basic Cartosat-I products will be presented. The algorithms heavily exploit the Rational Polynomial Coefficients (RPCs) that are associated with the image. The algorithms include improving the geometric positioning through Bundle Block Adjustment and producing refined RPCs; generating portable stereo views using raw/refined RPCs autonomously; orthorectification and mosaicing; and registering a monoscopic image rapidly with a single seed point. The outputs of these modules (including the refined RPCs) are in standard formats for further exploitation in third-party software. The design focus has been on minimizing user interaction and on heavy customization to suit the Indian context. The core libraries are in C/C++ and some of the applications come with a user-friendly GUI. Further customization to suit a specific workflow is feasible, as the requisite photogrammetric tools are in place and are continuously upgraded. The paper discusses the algorithms and the design considerations of developing the tools. The value-added products produced using these tools will also be presented.

  7. Design and evaluation of a software prototype for participatory planning of environmental adaptations.

    PubMed

    Eriksson, J; Ek, A; Johansson, G

    2000-03-01

    A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.

  8. Astronaut Office Scheduling System Software

    NASA Technical Reports Server (NTRS)

    Brown, Estevancio

    2010-01-01

    AOSS is a highly efficient scheduling application that uses various tools to schedule astronauts' weekly appointment information. This program represents an integration of many technologies into a single application to facilitate schedule sharing and management. It is a Windows-based application developed in Visual Basic. Because the NASA standard office automation load environment is Microsoft-based, Visual Basic provides AOSS developers with the ability to interact with Windows collaboration components by accessing object models from applications like Outlook and Excel. This also gives developers the ability to create new customizable components that perform specialized scheduling and reporting tasks inside the application. With this capability, AOSS can perform various asynchronous tasks, such as gathering, sending and managing astronauts' schedule information directly to their Outlook calendars at any time.
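
    AOSS itself is Visual Basic; as a hedged illustration of the same Outlook object-model interaction, the fragment below does it from Python via the pywin32 COM bindings. The appointment details are hypothetical; the constants are the standard Outlook values (1 = olAppointmentItem).

    ```python
    import datetime
    import win32com.client

    outlook = win32com.client.Dispatch("Outlook.Application")

    appointment = outlook.CreateItem(1)                # 1 = olAppointmentItem
    appointment.Subject = "T-38 proficiency flight"    # hypothetical entry
    appointment.Start = datetime.datetime(2010, 3, 15, 9, 0)
    appointment.Duration = 120                         # minutes
    appointment.Save()                                 # lands on the default calendar
    ```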

  9. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the option to estimate GFR by the plasma sampling method as well as by SrCrM. We used Microsoft Windows(®) as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access(®) as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine were done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, patient information and calculated GFR for further processing and comparison. This is user-friendly software to calculate GFR by various plasma sampling methods and blood parameters, and a good system for storing the raw and processed data for future analysis.
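
    Of the methods listed, Cockcroft-Gault is simple enough to show in full. A minimal Python sketch of that one calculation follows (the software itself is Visual Basic 6.0 and also implements Russell's, MIRD, Schwartz and Counahan-Barratt methods, which are not reproduced here):

    ```python
    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
        """Creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
        crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
        return crcl * 0.85 if female else crcl

    # Example: 60-year-old, 70 kg woman with serum creatinine 1.0 mg/dL
    print(round(cockcroft_gault(60, 70, 1.0, female=True), 1))  # ~66.1 mL/min
    ```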

  10. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi

    2006-06-01

    We explain the simulation tools of the JASMINE project (the JASMINE simulator). The JASMINE project is at the stage where its basic design will be determined within a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Component simulations are needed, of course, but total simulations which include all components from the observation target to the satellite system are also very important. We find that new software technologies, such as Object-Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.

  11. Methods and Tools for Ethical Usability

    NASA Astrophysics Data System (ADS)

    Kavathatzopoulos, Iordanis; Kostrzewa, Agata; Laaksoharju, Mikael

    The objectives of the tutorial are to provide knowledge of basic ethical, psychological and organizational theories relevant to considering ethical aspects during the design and use of IT systems; knowledge and skills in handling and solving ethical problems in connection with the design and use of IT systems; and skills in using questionnaires, surveys, interviews and the like in connection with software development and IT use. It contains lectures, workshops and exercises; the use of special tools to identify and consider IT-ethical issues during the planning, construction, installation and use of IT systems; and group exercises in which the participants train their ethical skills on IT-ethical conflicts and problems. The intended participants are system developers, purchasers, usability experts, academics and HCI teachers.

  12. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected with Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet computes basic AE statistics, and statistics as a function of time and pressure. AEAA provides added value beyond the analysis offered by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
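
    AEAA's source is not part of the record; the sketch below only illustrates the kind of basic statistic it reports, binning hypothetical hit data against time (the same pattern applies to pressure bins). The event arrays are invented stand-ins for vendor-exported hit data.

    ```python
    import numpy as np

    hit_time_s = np.array([0.5, 1.2, 1.3, 2.7, 3.1, 3.4, 3.5, 4.9])
    hit_amp_db = np.array([45., 51., 48., 60., 55., 47., 62., 50.])

    edges = np.arange(0.0, 6.0, 1.0)                       # 1-second bins
    counts, _ = np.histogram(hit_time_s, bins=edges)
    amp_sum, _ = np.histogram(hit_time_s, bins=edges, weights=hit_amp_db)
    mean_amp = np.divide(amp_sum, counts, out=np.zeros_like(amp_sum),
                         where=counts > 0)                 # avoid divide-by-zero bins

    for lo, n, a in zip(edges[:-1], counts, mean_amp):
        print(f"{lo:3.0f}-{lo+1:.0f} s: {n} hits, mean {a:.1f} dB")
    ```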

  13. Development of an Advanced Stimulation / Production Predictive Simulator for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritchett, John W.

    2015-04-15

    There are several well-known obstacles to the successful deployment of EGS projects on a commercial scale, of course. EGS projects are expected to be deeper, on the average, than conventional “natural” geothermal reservoirs, and drilling costs are already a formidable barrier to conventional geothermal projects. Unlike conventional resources (which frequently announce their presence with natural manifestations such as geysers, hot springs and fumaroles), EGS prospects are likely to appear fairly undistinguished from the earth surface. And, of course, the probable necessity of fabricating a subterranean fluid circulation network to mine the heat from the rock (instead of simply relying on natural, pre-existing permeable fractures) adds a significant degree of uncertainty to the prospects for success. Accordingly, the basic motivation for the work presented herein was to try to develop a new set of tools that would be more suitable for this purpose. Several years ago, the Department of Energy’s Geothermal Technologies Office recognized this need and funded a cost-shared grant to our company (then SAIC, now Leidos) to partner with Geowatt AG of Zurich, Switzerland and undertake the development of a new reservoir simulator that would be more suitable for EGS forecasting than the existing tools. That project has now been completed and a new numerical geothermal reservoir simulator has been developed. It is named “HeatEx” (for “Heat Extraction”) and is almost completely new, although its methodology owes a great deal to other previous geothermal software development efforts, including Geowatt’s “HEX-S” code, the STAR and SPFRAC simulators developed here at SAIC/Leidos, the MINC approach originally developed at LBNL, and tracer analysis software originally formulated at INEL. Furthermore, the development effort was led by engineers with many years of experience in using reservoir simulation software to make meaningful forecasts for real geothermal projects, not just software designers. It is hoped that, as a result, HeatEx will prove useful during the early stages of the development of EGS technology. The basic objective was to design a tool that could use field data that are likely to become available during the early phases of an EGS project (that is, during initial reconnaissance and fracture stimulation operations) to guide forecasts of the longer-term behavior of the system during production and heat-mining.

  14. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing-site characteristics using established criteria. We provide working examples and particularly focus on the concept of terrain roughness as it is interpreted in geomorphology and engineering studies.
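
    As one concrete example of a roughness definition from the geomorphology side, the sketch below computes the standard deviation of elevation in a moving window over a DEM; the window size and the synthetic DEM are illustrative choices, not the toolset's actual implementation.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def roughness(dem, window=5):
        """Local std of elevation via the identity var = E[x^2] - E[x]^2."""
        mean = uniform_filter(dem, size=window)
        mean_sq = uniform_filter(dem * dem, size=window)
        return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

    dem = np.random.default_rng(1).normal(0.0, 1.0, (200, 200)).cumsum(axis=0)
    r = roughness(dem)
    print("smoothest candidate cell:", np.unravel_index(r.argmin(), r.shape))
    ```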

  15. Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.

    2011-12-01

    Radar altimetry is very much a technique with expanding applications. Even if quite a lot of effort has been invested for oceanography users, the use of altimetry data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious for new users of altimetry data products. ESA and CNES therefore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future SARAL mission, and is ready for adaptation to Sentinel-3 products; - to perform some processing, data editing and statistics; - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL; - as processing/extraction routines, through the on-line command mode; - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 2000 people had downloaded it by Summer 2011, with many "newcomers" to altimetry among them, including teachers and professors worldwide. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in the recent version release (v3.0.1); others are under discussion for future development. Data use cases on CryoSat data will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  16. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in Matlab, a commercial language from The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past seven years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
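
    One of the quantities ZMAP maps is the Gutenberg-Richter b-value. Below is a minimal Python sketch of the standard maximum-likelihood estimate (Aki, 1965), applied to a synthetic catalog with a true b of 1.0; ZMAP itself is Matlab, and binned real catalogs need an extra half-bin-width correction omitted here.

    ```python
    import numpy as np

    def b_value(magnitudes, mc):
        """Aki (1965) maximum-likelihood b-value for events at or above Mc."""
        m = magnitudes[magnitudes >= mc]
        return np.log10(np.e) / (m.mean() - mc)

    # Synthetic Gutenberg-Richter catalog: exponential with rate b*ln(10), b = 1
    rng = np.random.default_rng(0)
    mags = 2.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
    print(f"b = {b_value(mags, mc=2.0):.2f}")      # close to 1.0
    ```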

  17. SeaDataNet Pan-European infrastructure for Ocean & Marine Data Management

    NASA Astrophysics Data System (ADS)

    Manzella, G. M.; Maillard, C.; Maudire, G.; Schaap, D.; Rickards, L.; Nast, F.; Balopoulos, E.; Mikhailov, N.; Vladymyrov, V.; Pissierssens, P.; Schlitzer, R.; Beckers, J. M.; Barale, V.

    2007-12-01

    SEADATANET is developing a pan-European data management infrastructure to ensure access to a large number of marine environmental data (i.e. temperature, salinity, current, sea level, and chemical, physical and biological properties), as well as their safeguarding and long-term archiving. Data are derived from many different sensors installed on board research vessels and satellites, and on the various platforms of the marine observing system. SeaDataNet provides information on real-time and archived marine environmental data collected at a pan-European level, through directories of marine environmental data and projects. SeaDataNet allows access to the most comprehensive multidisciplinary sets of marine in-situ and remote sensing data, from about 40 laboratories, through user-friendly tools. Data selection and access are operated through the Common Data Index (CDI): XML files compliant with ISO standards and unified dictionaries. Technical developments carried out by SeaDataNet include: a library of standards - metadata standards, compliant with ISO 19115, for communication and interoperability between the data platforms; software for an interoperable online system - interconnection of distributed data centres by interfacing adapted communication technology tools; off-line data management software - software representing the minimum equipment of all the data centres, developed by AWI ("Ocean Data View" (ODV)); and training, education and capacity building - training 'on the job' is carried out by IOC-UNESCO in Ostend, and the SeaDataNet Virtual Educational Centre internet portal provides basic tools for informal education.

  18. Software to Compare NPP HDF5 Data Files

    NASA Technical Reports Server (NTRS)

    Wiegand, Chiu P.; LeMoigne-Stewart, Jacqueline; Ruley, LaMont T.

    2013-01-01

    This software was developed for the NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project (NPP) Science Data Segment. The purpose of this software is to compare HDF5 (Hierarchical Data Format) files specific to NPP and report whether the HDF5 files are identical. If the HDF5 files are different, users have the option of printing out the list of differences in the HDF5 data files. The user provides paths to two directories containing a list of HDF5 files to compare. The tool would select matching HDF5 file names from the two directories and run the comparison on each file. The user can also select from three levels of detail. Level 0 is the basic level, which simply states whether the files match or not. Level 1 is the intermediate level, which lists the differences between the files. Level 2 lists all the details regarding the comparison, such as which objects were compared, and how and where they are different. The HDF5 tool is written specifically for the NPP project. As such, it ignores certain attributes (such as creation_date, creation_time, etc.) in the HDF5 files. This is because even though two HDF5 files could represent exactly the same granule, if they are created at different times, the creation date and time would be different. This tool is smart enough to ignore differences that are not relevant to NPP users.
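
    The NPP tool itself is not reproduced in the record; the following Python sketch shows the same comparison pattern with h5py, walking one file's tree, checking each dataset and its attributes against the same path in the other file, and skipping assumed per-run attributes. The file names are hypothetical, and a full tool would also report objects present in only one file.

    ```python
    import h5py
    import numpy as np

    IGNORED_ATTRS = {"creation_date", "creation_time"}   # per-run metadata to skip

    def diff_hdf5(path_a, path_b):
        differences = []
        with h5py.File(path_a, "r") as fa, h5py.File(path_b, "r") as fb:
            def visit(name, obj):
                if not isinstance(obj, h5py.Dataset):
                    return
                if name not in fb:
                    differences.append(f"missing in B: {name}")
                    return
                if not np.array_equal(obj[()], fb[name][()]):
                    differences.append(f"data differs: {name}")
                for key in set(obj.attrs) - IGNORED_ATTRS:
                    if not np.array_equal(obj.attrs[key], fb[name].attrs.get(key)):
                        differences.append(f"attribute differs: {name}@{key}")
            fa.visititems(visit)
        return differences

    for line in diff_hdf5("granule_a.h5", "granule_b.h5"):  # hypothetical files
        print(line)
    ```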

  19. Use of Google Earth to strengthen public health capacity and facilitate management of vector-borne diseases in resource-poor environments.

    PubMed

    Lozano-Fuentes, Saul; Elizondo-Quiroga, Darwin; Farfan-Ale, Jose Arturo; Loroño-Pino, Maria Alba; Garcia-Rejon, Julian; Gomez-Carro, Salvador; Lira-Zumbardo, Victor; Najera-Vazquez, Rosario; Fernandez-Salas, Ildefonso; Calderon-Martinez, Joaquin; Dominguez-Galera, Marco; Mis-Avila, Pedro; Morris, Natashia; Coleman, Michael; Moore, Chester G; Beaty, Barry J; Eisen, Lars

    2008-09-01

    Novel, inexpensive solutions are needed for improved management of vector-borne and other diseases in resource-poor environments. Emerging free software providing access to satellite imagery and simple editing tools (e.g. Google Earth) complements existing geographic information system (GIS) software and provides new opportunities for: (i) strengthening overall public health capacity through development of information for city infrastructures; and (ii) display of public health data directly on an image of the physical environment. We used freely accessible satellite imagery and a set of feature-making tools included in the software (allowing for production of polygons, lines and points) to generate information for city infrastructure and to display disease data in a dengue decision support system (DDSS) framework. Two cities in Mexico (Chetumal and Merida) were used to demonstrate that a basic representation of city infrastructure useful as a spatial backbone in a DDSS can be rapidly developed at minimal cost. Data layers generated included labelled polygons representing city blocks, lines representing streets, and points showing the locations of schools and health clinics. City blocks were colour-coded to show the presence of dengue cases. The data layers were successfully imported, in a format known as shapefile, into GIS software. The combination of Google Earth and free GIS software (e.g. HealthMapper, developed by WHO, and SIGEpi, developed by PAHO) has tremendous potential to strengthen overall public health capacity and facilitate decision support system approaches to prevention and control of vector-borne diseases in resource-poor environments.
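
    A hedged sketch of generating such layers programmatically with the simplekml package (one possible route; the study produced its layers with Google Earth's interactive feature-making tools). The coordinates, names and colour choice are hypothetical.

    ```python
    import simplekml

    kml = simplekml.Kml()

    clinic = kml.newpoint(name="Health clinic 12",
                          coords=[(-88.3054, 18.5036)])     # (lon, lat)

    block = kml.newpolygon(name="City block 145",
                           outerboundaryis=[(-88.306, 18.503), (-88.305, 18.503),
                                            (-88.305, 18.504), (-88.306, 18.504),
                                            (-88.306, 18.503)])
    # Colour-code the block semi-transparent red to flag reported dengue cases
    block.style.polystyle.color = simplekml.Color.changealphaint(
        150, simplekml.Color.red)

    kml.save("dengue_blocks.kml")
    ```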

  20. Managing Written Directives: A Software Solution to Streamline Workflow.

    PubMed

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases confusion about the creation, completion, filing, and retrieval of directives. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  1. Integrated Baseline System (IBS), Version 1.03. [Chemical Stockpile Emergency Preparedness Program]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, B.M.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the user guide for the IBS and explains how to operate the IBS system. The fundamental function of the IBS is to provide tools that civilian emergency management personnel can use in developing emergency plans and in supporting emergency management activities to cope with a chemical-releasing event at a military chemical stockpile. Emergency management planners can evaluate concepts and ideas using the IBS system. The results of that experience can then be factored into refining requirements and plans. This document provides information for the general system user, and is the primary reference for the system features of the IBS. It is designed for persons who are familiar with general emergency management concepts, operations, and vocabulary. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary. IBS is a dynamic system. Its capabilities are in a state of continuing expansion and enhancement.

  2. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  3. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
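
    The paper's connector framework is not reproduced in the abstract; the following Python fragment is only a hypothetical sketch of the core idea it describes, pairing access to a data source with transformation rules that map records onto a shared representation.

    ```python
    from typing import Callable, Dict, List

    class Connector:
        def __init__(self, fetch: Callable[[], List[Dict]],
                     rules: List[Callable[[Dict], Dict]]):
            self.fetch = fetch          # access to a heterogeneous data source
            self.rules = rules          # transformation rules on exchanged data

        def provide(self) -> List[Dict]:
            records = self.fetch()
            for rule in self.rules:
                records = [rule(r) for r in records]
            return records              # records in the shared representation

    def rename(record):
        # map a tool-specific field name onto an assumed reference-ontology term
        return {("expression_level" if k == "signal" else k): v
                for k, v in record.items()}

    def source():
        return [{"gene": "TP53", "signal": 7.2}]       # invented example record

    print(Connector(source, [rename]).provide())
    ```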

  4. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal effort by indicating problems and/or benefits of different approaches and designs.

  5. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    NASA Technical Reports Server (NTRS)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  6. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
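
    The protocol details are not given in the record; the sketch below is a hypothetical illustration of the reassembly logic described above: packets carry offset metadata, so a partial product can be used immediately and its gaps listed for acknowledgement or retransmission requests.

    ```python
    class FileProduct:
        def __init__(self, total_size):
            self.total_size = total_size
            self.segments = {}                     # offset -> payload bytes

        def add_packet(self, offset, payload):
            self.segments[offset] = payload

        def missing_ranges(self):
            """Byte ranges not yet received, in order."""
            gaps, cursor = [], 0
            for offset in sorted(self.segments):
                if offset > cursor:
                    gaps.append((cursor, offset))
                cursor = max(cursor, offset + len(self.segments[offset]))
            if cursor < self.total_size:
                gaps.append((cursor, self.total_size))
            return gaps                            # ranges to request again

    product = FileProduct(total_size=100)
    product.add_packet(0, b"x" * 40)
    product.add_packet(60, b"y" * 40)
    print(product.missing_ranges())                # [(40, 60)]
    ```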

  7. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
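
    WinTRAX integrates full trajectories through analytic multipole field expressions; as a much-simplified illustration of the underlying focusing physics, the sketch below propagates a ray through standard thick-quadrupole first-order transfer matrices. The quadrupole strength and length are illustrative values, not taken from the paper.

    ```python
    import numpy as np

    def quad_matrix(k, length, focusing=True):
        """2x2 (x, x') transfer matrix of a thick quadrupole; k in m^-2, length in m."""
        phi = np.sqrt(k) * length
        if focusing:
            return np.array([[np.cos(phi), np.sin(phi) / np.sqrt(k)],
                             [-np.sqrt(k) * np.sin(phi), np.cos(phi)]])
        return np.array([[np.cosh(phi), np.sinh(phi) / np.sqrt(k)],
                         [np.sqrt(k) * np.sinh(phi), np.cosh(phi)]])

    ray = np.array([1e-3, 0.0])                   # 1 mm offset, parallel to axis
    for element in (quad_matrix(100.0, 0.1),      # focusing in this plane
                    quad_matrix(100.0, 0.1, focusing=False)):
        ray = element @ ray
    print(ray)                                    # position (m) and angle (rad)
    ```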

  8. CaveMan Enterprise version 1.0 Software Validation and Verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, David

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.

  9. A microkernel design for component-based parallel numerical software systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.

    1999-01-13

    What is the minimal software infrastructure and what type of conventions are needed to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
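
    A hypothetical Python sketch of the two microkernel ingredients listed above, run-time loading plus reference-counted object lifetime; the ALICE microkernel targets dynamically linked native components rather than Python modules, so this only mirrors the shape of the design.

    ```python
    import importlib

    class Microkernel:
        def __init__(self):
            self.objects = {}                     # handle -> [component, refcount]

        def load(self, module_name, factory_name, handle):
            module = importlib.import_module(module_name)   # dynamic loading
            component = getattr(module, factory_name)()
            self.objects[handle] = [component, 1]
            return component

        def retain(self, handle):
            self.objects[handle][1] += 1

        def release(self, handle):
            entry = self.objects[handle]
            entry[1] -= 1
            if entry[1] == 0:                     # object life ends at zero refs
                del self.objects[handle]

    kernel = Microkernel()
    rng = kernel.load("random", "Random", handle="rng0")   # stdlib class as demo
    print(rng.random())
    kernel.release("rng0")
    ```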

  10. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures from each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  11. Sensor control of robot arc welding

    NASA Technical Reports Server (NTRS)

    Sias, F. R., Jr.

    1985-01-01

    A basic problem in the application of robots to welding was studied: how to guide a torch along a weld seam using sensory information. Improvement of the quality and consistency of certain Gas Tungsten Arc welds on the Space Shuttle Main Engine (SSME) that are too complex geometrically for conventional automation, and are therefore done by hand, was examined. The particular problems associated with SSME manufacturing and weld-seam tracking were analyzed, with an emphasis on computer vision methods. Special interface software was developed for the MINC computer, allowing it to be used both as a test system to check out the robot interface software and later as a development tool for further investigation of sensory systems to be incorporated in welding procedures.

  12. A straightforward graphical user interface for basic and advanced signal processing of thermographic infrared sequences

    NASA Astrophysics Data System (ADS)

    Klein, Matthieu T.; Ibarra-Castanedo, Clemente; Maldague, Xavier P.; Bendada, Abdelhakim

    2008-03-01

    IR-View is a free and open source Matlab software package that was released in 1998 at the Computer Vision and Systems Laboratory (CVSL) at Université Laval, Canada, as an answer to many common and recurrent needs in infrared thermography. IR-View has proven to be a useful tool at CVSL for the past 10 years. The software by itself and/or its concept and functions may be of interest to other laboratories and companies doing research in the IR NDT field. This article describes the functions and processing techniques integrated in IR-View, freely downloadable under the GNU license at http://mivim.gel.ulaval.ca. A demonstration of IR-View's functionality will also be given during the DSS08 SPIE Defense and Security Symposium.

  13. Learning motion concepts using real-time microcomputer-based laboratory tools

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1990-09-01

    Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.

  14. Using articulation and inscription as catalysts for reflection: Design principles for reflective inquiry

    NASA Astrophysics Data System (ADS)

    Loh, Ben Tun-Bin

    2003-07-01

    The demand for students to engage in complex student-driven and information-rich inquiry investigations poses challenges to existing learning environments. Students are not familiar with this style of work, and lack the skills, tools, and expectations it demands, often forging blindly forward in the investigation. If students are to be successful, they need to learn to be reflective inquirers, periodically stepping back from an investigation to evaluate their work. The fundamental goal of my dissertation is to understand how to design learning environments to promote and support reflective inquiry. I have three basic research questions: how to define this mode of work, how to help students learn it, and how it facilitates reflection when enacted in a classroom. I take an exploratory approach in which, through iterative cycles of design, development, and reflection, I develop principles of design for reflective inquiry, instantiate those principles in the design of a software environment, and test that software in the context of classroom work. My work contributes to the understanding of reflective inquiry in three ways: First, I define a task model that describes the kinds of operations (cognitive tasks) that students should engage in as reflective inquirers. These operations are defined in terms of two basic tasks: articulation and inscription, which serve as catalysts for externalizing student thinking as objects of and triggers for reflection. Second, I instantiate the task model in the design of software tools (the Progress Portfolio). And, through proof-of-concept pilot studies, I examine how the task model and tools helped students with their investigative classroom work. Finally, I take a step back from these implementations and articulate general design principles for reflective inquiry with the goal of informing the design of other reflective inquiry learning environments. There are three design principles: (1) Provide a designated work space for reflection activities to focus student attention on reflection. (2) Help students create and use artifacts that represent their work and their thinking as a means to create referents for reflection. (3) Support and take advantage of social processes that help students reflect on their own work.

  15. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Software tools for developing and managing software at the source code level on the Peregrine system. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  16. XML schemas for common bioinformatic data types and their application in workflow systems

    PubMed Central

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-01-01

    Background Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data – therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Results Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at , the BioDOM library can be obtained at . Conclusion The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios. PMID:17087823
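
    In a workflow setting, the practical payoff of such schemas is that every tool can validate its input before running. A minimal Python sketch using the generic lxml library (not the BioDOM Java API; the file names are hypothetical):

        from lxml import etree

        def validate(xml_path, schema_path):
            """Check a data file against an XML schema before passing it on."""
            schema = etree.XMLSchema(etree.parse(schema_path))
            doc = etree.parse(xml_path)
            if not schema.validate(doc):
                raise ValueError(schema.error_log.last_error)
            return doc

        # doc = validate("alignment.xml", "alignment.xsd")  # hypothetical files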

  17. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information from behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, and calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.
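
    As an illustration of the kind of parameter students can perturb interactively, a rotation matrix built from three angles; a sketch under one common photogrammetric convention (omega-phi-kappa), which need not match PhoX's internal definition:

        import numpy as np

        def rotation_matrix(omega, phi, kappa):
            """R = Rz(kappa) @ Ry(phi) @ Rx(omega), angles in radians."""
            co, so = np.cos(omega), np.sin(omega)
            cp, sp = np.cos(phi), np.sin(phi)
            ck, sk = np.cos(kappa), np.sin(kappa)
            Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
            Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
            Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
            return Rz @ Ry @ Rx

        # Perturbing a single angle and re-printing R mimics the interactive
        # "change one parameter, watch the effect" exercise described above.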

  18. Quadratic Blind Linear Unmixing: A Graphical User Interface for Tissue Characterization

    PubMed Central

    Gutierrez-Navarro, O.; Campos-Delgado, D.U.; Arce-Santana, E. R.; Jo, Javier A.

    2016-01-01

    Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limitation in the maximum number of components or end-members. Hence, this work presents an interactive software tool which implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available on a webpage hosted by one of the developing institutions, and gives the user a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. PMID:26589467

  19. Quadratic blind linear unmixing: A graphical user interface for tissue characterization.

    PubMed

    Gutierrez-Navarro, O; Campos-Delgado, D U; Arce-Santana, E R; Jo, Javier A

    2016-02-01

    Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limitation in the maximum number of components or end-members. Hence, this work presents an interactive software tool which implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available on a webpage hosted by one of the developing institutions, and gives the user a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
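
    The linear mixture model underlying both records expresses each measured spectrum as y = E a, with non-negative abundances a that sum to one. A minimal Python sketch of that model (not the authors' BEAE/QBLU algorithms), enforcing non-negativity exactly and sum-to-one softly via row augmentation:

        import numpy as np
        from scipy.optimize import nnls

        def unmix(y, endmembers, sum_weight=1e3):
            """Abundances for one measurement under a linear mixture model.

            y          : observed spectrum, shape (bands,)
            endmembers : end-member spectra as columns, shape (bands, n)
            """
            bands, n = endmembers.shape
            E = np.vstack([endmembers, sum_weight * np.ones((1, n))])
            b = np.append(y, sum_weight)      # soft sum-to-one constraint
            abundances, _ = nnls(E, b)        # exact non-negativity
            return abundances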

  20. PhysioNet: physiologic signals, time series and related open source software for basic, clinical, and applied research.

    PubMed

    Moody, George B; Mark, Roger G; Goldberger, Ary L

    2011-01-01

    PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.

  1. Differential gene and transcript expression analysis of RNA-seq experiments with TopHat and Cufflinks

    PubMed Central

    Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior

    2012-01-01

    Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036

  2. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
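
    To make the core calculation concrete, a minimal Python sketch of the standard-ratio estimate, assuming Q_DB = (m1 - m0)/m0 following Dekkers and Böhnel (2006) and a bootstrap over specimens; MSP-Tool itself is VBA and adds the corrected ratios and reliability criteria described above:

        import numpy as np

        def msp_db_estimate(h_lab, m0, m1, n_boot=2000, seed=0):
            """Paleointensity from the standard ratio Q_DB = (m1 - m0) / m0.

            h_lab : applied lab field per specimen (numpy array)
            m0    : NRM moment of each specimen
            m1    : moment after in-field heating
            """
            q = (m1 - m0) / m0
            slope, intercept = np.polyfit(h_lab, q, 1)
            estimate = -intercept / slope      # field where the fit crosses Q = 0

            rng = np.random.default_rng(seed)
            rows = rng.integers(0, len(q), size=(n_boot, len(q)))
            boots = []
            for r in rows:                     # resample specimens with replacement
                s, i = np.polyfit(h_lab[r], q[r], 1)
                boots.append(-i / s)
            lo, hi = np.percentile(boots, [2.5, 97.5])
            return estimate, (lo, hi)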

  3. Basic Radar Altimetry Toolbox: tools to teach altimetry for ocean

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, Jerome; Bronner, Emilie; Niemeijer, Sander; Lucas, Bruno Manuel; Dinardo, Salvatore

    2013-04-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, CryoSat. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people had downloaded it as of January 2013, with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in versions 2 and 3; others are under discussion for the future, including support for the future Sentinel-3. The Basic Radar Altimetry Toolbox is able: - to read most distributed radar altimetry data, including data from future missions like Saral, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways, including as an educational tool with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, and showing the basic methods for some of the most frequent ways of using altimetry data. Examples from educational use will be presented, and feedback from those who used it as such will be most welcome. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  4. Receiver operating characteristic (ROC) curves: review of methods with applications in diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Obuchowski, Nancy A.; Bullen, Jennifer A.

    2018-04-01

    Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
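
    A minimal Python sketch of the basic quantities the review builds on: sensitivity and specificity at one threshold, and the empirical area under the ROC curve via its Mann-Whitney interpretation (the probability that a diseased case scores above a non-diseased one):

        import numpy as np

        def sensitivity_specificity(scores, labels, threshold):
            """Accuracy metrics at a single decision threshold (numpy arrays)."""
            pred = scores >= threshold
            tp = np.sum(pred & (labels == 1))
            fn = np.sum(~pred & (labels == 1))
            tn = np.sum(~pred & (labels == 0))
            fp = np.sum(pred & (labels == 0))
            return tp / (tp + fn), tn / (tn + fp)

        def empirical_auc(scores, labels):
            """Area under the empirical ROC curve (Mann-Whitney statistic)."""
            pos = scores[labels == 1]
            neg = scores[labels == 0]
            # Count diseased-vs-non-diseased score comparisons, ties as 1/2
            wins = (pos[:, None] > neg[None, :]).sum() \
                   + 0.5 * (pos[:, None] == neg[None, :]).sum()
            return wins / (len(pos) * len(neg))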

  5. STS Case Study Development Support

    NASA Technical Reports Server (NTRS)

    Rosa de Jesus, Dan A.; Johnson, Grace K.

    2013-01-01

    The Shuttle Case Study Collection (SCSC) has been developed using lessons learned documented by NASA engineers, analysts, and contractors. The SCSC provides educators with a new tool to teach real-world engineering processes, with the goal of providing unique educational materials that enhance critical thinking, decision-making and problem-solving skills. During this third phase of the project, responsibilities included the revision of the Hyper Text Markup Language (HTML) source code to ensure all pages follow World Wide Web Consortium (W3C) standards, and the addition and editing of website content, including text, documents, and images. Basic HTML knowledge was required, as was basic knowledge of photo-editing software, along with training in the use of NASA's Content Management System for website design. The outcome of this project was its release to the public.

  6. Dental Informatics tool "SOFPRO" for the study of oral submucous fibrosis.

    PubMed

    Erlewad, Dinesh Masajirao; Mundhe, Kalpana Anandrao; Hazarey, Vinay K

    2016-01-01

    Dental informatics is an evolving branch widely used in dental education and practice. Numerous applications that support clinical care, education and research have been developed. However, very few such applications have been developed and utilized in epidemiological studies of oral submucous fibrosis (OSF), which affects a significant population of Asian countries. The aim was to design and develop a user-friendly software tool for the descriptive epidemiological study of OSF. With the help of a software engineer, a computer program, SOFPRO, was designed and developed using MS Visual Basic 6.0 (VB), MS Access 2000, Crystal Report 7.0 and MS Paint on the Windows XP operating system. For analysis purposes, the available OSF data from the departmental precancer registry was fed into SOFPRO. Known, unknown, and null data were successfully accepted in data entry and represented in the data analysis of OSF. The smooth working of SOFPRO and its correct data flow were tested against real-time data of OSF. SOFPRO was found to be a user-friendly automated tool for the easy collection, retrieval, management and analysis of data on OSF patients.

  7. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine.

    PubMed

    Christodoulou, Nikolaos A; Tousert, Nikolaos E; Georgiadi, Eleni Ch; Argyri, Katerina D; Misichroni, Fay D; Stamatakos, Georgios S

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice - following their clinical validation - have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features.

  8. bioWeb3D: an online webGL 3D data visualisation tool.

    PubMed

    Pettit, Jean-Baptiste; Marioni, John C

    2013-06-07

    Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionality to represent and manipulate biological 3D datasets, very few packages are easily accessible (browser-based), cross-platform and accessible to non-expert users. An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in Javascript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets.

  9. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine

    PubMed Central

    Christodoulou, Nikolaos A.; Tousert, Nikolaos E.; Georgiadi, Eleni Ch.; Argyri, Katerina D.; Misichroni, Fay D.; Stamatakos, Georgios S.

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice – following their clinical validation – have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features. PMID:27812280

  10. The New Meteor Radar at Penn State: Design and First Observations

    NASA Technical Reports Server (NTRS)

    Urbina, J.; Seal, R.; Dyrud, L.

    2011-01-01

    In an effort to provide new and improved meteor radar sensing capabilities, Penn State has been developing advanced instruments and technologies for future meteor radars, with primary objectives of making such instruments more capable and more cost effective in order to study the basic properties of the global meteor flux, such as average mass, velocity, and chemical composition. Using low-cost field programmable gate arrays (FPGAs), combined with open source software tools, we describe a design methodology enabling one to develop state-of-the-art radar instrumentation, by developing a generalized instrumentation core that can be customized using specialized output stage hardware. Furthermore, using object-oriented programming (OOP) techniques and open-source tools, we illustrate a technique to provide a cost-effective, generalized software framework to uniquely define an instrument's functionality through a customizable interface, implemented by the designer. The new instrument is intended to provide instantaneous profiles of atmospheric parameters and climatology on a daily basis throughout the year. An overview of the instrument design concepts and some of the emerging technologies developed for this meteor radar are presented.

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Mathematics and Computer Education, 1987

    1987-01-01

    Presented are reviews of several microcomputer software programs. Included are reviews of: (1) Microstat (Zenith); (2) MathCAD (MathSoft); (3) Discrete Mathematics (True Basic); (4) CALCULUS (True Basic); (5) Linear-Kit (John Wiley); and (6) Geometry Sensei (Broderbund). (RH)

  12. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert

    1985-01-01

    The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.

  13. Debugging and Performance Analysis Software Tools for Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about debugging and performance analysis software tools available for use with the Peregrine system, such as Allinea.

  14. Investigation of roughing machining simulation by using visual basic programming in NX CAM system

    NASA Astrophysics Data System (ADS)

    Hafiz Mohamad, Mohamad; Nafis Osman Zahid, Muhammed

    2018-03-01

    This paper outlines a simulation study investigating the characteristics of roughing machining simulation in 4th-axis milling processes by utilizing Visual Basic programming in the NX CAM system. The selection and optimization of cutting orientation in rough milling operations is critical in 4th-axis machining. The main purpose of a roughing operation is to approximately shape the machined part into its finished form by removing the bulk of material from the workpiece. In this paper, the simulations are executed by manipulating a set of different cutting orientations to generate the estimated volume removed from the machined parts. The cutting orientation with the highest volume removal is taken as the optimum and chosen to execute the roughing operation. In order to run the simulations, customized software was developed to assist the routines. Operation build-up instructions in the NX CAM interface are translated into programming code via advanced tools available in Visual Basic Studio. The code is customized and equipped with decision-making tools to run and control the simulations, and it permits integration with independent program files to execute specific operations. This paper discusses the simulation program and identifies the optimum cutting orientations for roughing processes. The output of this study will broaden the simulation routines performed in NX CAM systems.
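
    The selection rule itself reduces to an argmax over candidate orientations. A minimal Python sketch, with hypothetical simulated removal volumes standing in for the NX CAM batch results:

        def best_orientation(volume_by_angle):
            """Pick the 4th-axis cutting angle with the largest material removal."""
            return max(volume_by_angle, key=volume_by_angle.get)

        # Hypothetical output of the batch simulation runs (degrees -> cm^3)
        volume_by_angle = {0: 41.2, 15: 44.8, 30: 47.5, 45: 46.1, 90: 39.3}
        print(best_orientation(volume_by_angle))   # -> 30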

  15. Making the EZ Choice

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Analytical Mechanics Associates, Inc. (AMA), of Hampton, Virginia, created the EZopt software application through Small Business Innovation Research (SBIR) funding from NASA's Langley Research Center. The new software is a user-friendly tool kit that provides quick and logical solutions to complex optimal control problems. In its most basic form, EZopt converts process data into math equations and then proceeds to utilize those equations to solve problems within control systems. EZopt successfully proved its advantage when applied to short-term mission planning and onboard flight computer implementation. The technology has also solved multiple real-life engineering problems faced in numerous commercial operations. For instance, mechanical engineers use EZopt to solve control problems with robots, while chemical plants implement the application to overcome situations with batch reactors and temperature control. In the emerging field of commercial aerospace, EZopt is able to optimize trajectories for launch vehicles and perform potential space station-keeping tasks. Furthermore, the software also helps control electromagnetic devices in the automotive industry.

  16. Software-centric View on OVMS for LBT

    NASA Astrophysics Data System (ADS)

    Trowitzsch, J.; Borelli, J.; Pott, J.; Kürster, M.

    2012-09-01

    The performance of infrared interferometry (IF) and adaptive optics (AO) strongly depends on the mitigation and correction of telescope vibrations. Therefore, the OVMS, the Optical Path Difference and Vibration Monitoring System, is being installed at the Large Binocular Telescope (LBT). It is meant to ensure suitable conditions for adaptive optics and interferometry. The vibration information is collected from accelerometers that are distributed over the optical elements of the LBT. The collected vibration measurements are converted into tip-tilt and optical path difference data. That data is utilized in the control strategies of the LBT adaptive secondary mirrors and the beam-combining interferometers, LINC-NIRVANA and LBTI. Within the OVMS, the software part is the responsibility of the LINC-NIRVANA team at MPIA Heidelberg. It comprises the software for the real-time data acquisition from the accelerometers as well as the related telemetry interface and the vibration monitoring quick-look tools. The basic design ideas, implementation details and special features are explained here.

  17. Development of a comprehensive software engineering environment

    NASA Technical Reports Server (NTRS)

    Hartrum, Thomas C.; Lamont, Gary B.

    1987-01-01

    The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. Development of a global approach was initiated in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert system software development associates and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.

  18. Browsing software of the Visible Korean data used for teaching sectional anatomy.

    PubMed

    Shin, Dong Sun; Chung, Min Suk; Park, Hyo Seok; Park, Jin Seo; Hwang, Sung Bae

    2011-01-01

    The interpretation of computed tomographs (CTs) and magnetic resonance images (MRIs) to diagnose clinical conditions requires basic knowledge of sectional anatomy. Sectional anatomy has traditionally been taught using sectioned cadavers, atlases, and/or computer software. The computer software commonly used for this subject is practical and efficient for students but could be more advanced. The objective of this research was to present browsing software developed from the Visible Korean images that can be used for teaching sectional anatomy. One thousand seven hundred and two sets of MRIs, CTs, and sectioned images (intervals, one millimeter) of a whole male cadaver were prepared. Over 900 structures in the sectioned images were outlined and then filled with different colors to elaborate each structure. Software was developed where four corresponding images could be displayed simultaneously; in addition, the structures in the image data could be readily recognized with the aid of the color-filled outlines. The software, distributed free of charge, could be a valuable tool to teach medical students. For example, sectional anatomy could be taught by showing the sectioned images with real color and high resolution. Students could then review the lecture by using the sectioned and color-filled images on their own computers. Students could also be evaluated using the same software. Furthermore, other investigators would be able to replace the images for more comprehensive sectional anatomy. Copyright © 2011 Wiley-Liss, Inc.

  19. CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 5, May 2007

    DTIC Science & Technology

    2007-05-01

    OCR fragment from the issue's references and back matter; recoverable content includes citations to the federal acquisition regulations (GSA, DoD, and NASA, 2005, <http://www.arnet.gov/far/>) and to NIST FIPS Pub 200, Minimum Security..., plus a note that the NASA Goddard Space Flight Center (GSFC) Software Assurance Web site (http://sw-assurance.gsfc.nasa.gov) provides tools.

  20. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible through our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  1. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
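
    Of the five algorithms listed, Basic COCOMO is the simplest to state. A Python sketch using Boehm's published coefficients (this is the textbook model, not COSTMODL's Pascal implementation or its customization features):

        # Basic COCOMO: effort = a * KLOC**b person-months,
        # schedule = 2.5 * effort**d calendar months
        COEFFS = {
            "organic":       (2.4, 1.05, 0.38),
            "semi-detached": (3.0, 1.12, 0.35),
            "embedded":      (3.6, 1.20, 0.32),
        }

        def basic_cocomo(kloc, mode="organic"):
            a, b, d = COEFFS[mode]
            effort = a * kloc ** b           # person-months
            schedule = 2.5 * effort ** d     # calendar months
            return effort, schedule, effort / schedule   # avg. staffing

        print(basic_cocomo(32, "embedded"))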

  2. Evolving software reengineering technology for the emerging innovative-competitive era

    NASA Technical Reports Server (NTRS)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state-of-the-art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.

  3. Survey of basic medical researchers on the awareness of animal experimental designs and reporting standards in China.

    PubMed

    Ma, Bin; Xu, Jia-Ke; Wu, Wen-Jing; Liu, Hong-Yan; Kou, Cheng-Kun; Liu, Na; Zhao, Lulu

    2017-01-01

    To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation's (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to investigate basic information and awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis. The number of cases (n) and percentage (%) of classified information were statistically described, and the comparison between groups (i.e., current students vs. research staff) was performed using the chi-square test. A total of 298 questionnaires were distributed, and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that the reports of animal experimental studies in Chinese literature were inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately 1/3 of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out or participated in, and 11/266 (4.1%) had published, systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. Therefore, specific measures are necessary to promote and popularize these standards and specifications and to introduce these standards into guidelines of Chinese domestic journals as soon as possible to raise awareness and increase use rates among researchers and journal editors, thereby improving the quality of animal experimental methods and reports.
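
    The between-group comparison described is an ordinary chi-square test on a contingency table. A Python sketch with hypothetical counts (the article reports only percentages and P values, not the full table):

        from scipy.stats import chi2_contingency

        # Hypothetical 2x2 table: awareness of SYRCLE's tool (aware / not aware)
        table = [[28,  90],    # current students (n = 118)
                 [14, 134]]    # research staff   (n = 148)
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p:.4f}")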

  4. MASTOS: Mammography Simulation Tool for design Optimization Studies.

    PubMed

    Spyrou, G; Panayiotakis, G; Tzanakos, G

    2000-01-01

    Mammography is a high-quality imaging technique for the detection of breast lesions, which requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such a high-cost apparatus. The optimum operational parameters must also be defined well before the real breast examination. MASTOS is a software package, based on Monte Carlo methods, that is designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, as well as the parameters specifying the shape and composition of the breast phantom. In addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image and calculations of various factors that describe the image quality. The Monte Carlo simulation code is PC-based and is driven by an outer shell with a graphical user interface. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.

  5. Visualization: a tool for enhancing students' concept images of basic object-oriented concepts

    NASA Astrophysics Data System (ADS)

    Cetin, Ibrahim

    2013-03-01

    The purpose of this study was twofold: to investigate students' concept images about class, object, and their relationship, and to help them enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate students' concept images, the researcher developed a survey including open-ended questions, which was administered to the participants. Follow-up interviews with 12 randomly selected students were conducted to explore their answers to the survey in depth. The results of the first part of the research were utilized to construct visualization scenarios. The students used these scenarios to develop animations using Flash software. The study found that most of the students experienced difficulties in learning object-oriented notions. Overdependence on code-writing practice and examples and incorrectly learned analogies were determined to be the sources of their difficulties. Moreover, visualization was found to be a promising approach in facilitating students' concept images of basic object-oriented notions. The results of this study have implications for researchers and practitioners when designing programming instruction.

  6. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying causes of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
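
    The arithmetic such tools automate is the propagation of basic-event probabilities through the gates. A minimal Python sketch for independent events (illustrative only, not any particular commercial tool):

        from math import prod

        def and_gate(probs):
            """All input events must occur (independent events)."""
            return prod(probs)

        def or_gate(probs):
            """At least one input event occurs (independent events)."""
            return 1 - prod(1 - p for p in probs)

        # Hypothetical tree: top = OR(hardware fault, AND(software bug, watchdog miss))
        top = or_gate([1e-4, and_gate([1e-3, 1e-2])])
        print(f"top-event probability: {top:.2e}")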

  7. Design of two-channel oscilloscope and basic circuit simulations in LabView

    NASA Astrophysics Data System (ADS)

    Balzhiev, Plamen; Makal, Jaroslaw

    2008-01-01

    The project was realized as a diploma thesis at Bialystok Technical University, Poland. The main aim was to develop a useful educational tool that presents the time and frequency characteristics of basic electrical circuits. It is designed as a helpful instrument for lectures and laboratory classes. The predominant audience will be first-semester students of electrical engineering. Because the level of knowledge at this stage of education is not yet high, different techniques are necessary to increase students' interest and the efficiency of the teaching process. This educational instrument provides the needed knowledge concerning basic circuits and their parameters. Graphics and animations of the general processes in electrical circuits make the problems more interesting, comprehensible and easier to understand. For designing such an instrument, National Instruments' programming environment LabView is used. It is preferred to other simulation software because of its simplicity, flexibility and availability (the free demo version is sufficient to make a simple virtual instrument). LabView uses a graphical programming language and has powerful mathematical functions for analysis and simulations. Its useful visualization tools for presenting different diagrams are worth recommending, too. It is also specialized in measurement and control, and it supports a wide variety of hardware. This software is therefore suitable for laboratory classes, where it can relate the simulated characteristics of basic electrical circuits to the real characteristics measured with a hardware device. For this purpose a two-channel oscilloscope was designed as part of the described project. The main purpose of this instrument in the educational process is to present the desired characteristics of electrical circuits and to familiarize students with the general functions of an oscilloscope. The project combines several important features appropriate for teaching purposes: well-presented information with graphics, ease of operation, and delivery of the necessary knowledge. This method of teaching is more interesting and attractive to the audience, and the information is assimilated more quickly, with less effort.

  8. VIP Barcoding: composition vector-based software for rapid species identification based on DNA barcoding.

    PubMed

    Fan, Long; Hui, Jerome H L; Yu, Zu Guo; Chu, Ka Hou

    2014-07-01

    Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and has been criticized for its local optimization. However, current more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the searching space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/. © 2014 John Wiley & Sons Ltd.
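
    The second-stage assignment reduces to a nearest-neighbour search under the Kimura two-parameter (K2P) distance. A minimal Python sketch assuming pre-aligned sequences (the first-stage composition-vector screening is omitted):

        import math

        PURINES = {"A", "G"}

        def k2p_distance(seq1, seq2):
            """Kimura two-parameter distance between two aligned DNA sequences."""
            pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
                     if a in "ACGT" and b in "ACGT"]      # skip gaps/ambiguities
            n = len(pairs)
            ts = sum(a != b and (a in PURINES) == (b in PURINES) for a, b in pairs)
            tv = sum(a != b and (a in PURINES) != (b in PURINES) for a, b in pairs)
            P, Q = ts / n, tv / n
            return -0.5 * math.log((1 - 2*P - Q) * math.sqrt(1 - 2*Q))

        def assign(query, reference_db):
            """Label the query with its nearest reference under K2P."""
            return min(reference_db,
                       key=lambda name: k2p_distance(query, reference_db[name]))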

  9. An Automated Method for Identifying Inconsistencies within Diagrammatic Software Requirements Specifications

    NASA Technical Reports Server (NTRS)

    Zhang, Zhong

    1997-01-01

    The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. Often, in such evolving systems, striving for consistency is complicated by many factors, because development participants have various locations, skills, responsibilities, roles, opinions, languages and terminology, and differ in the degrees of abstraction they employ. This naturally leads to many partial specifications or viewpoints. These multiple views on the system being developed usually overlap; at the same time, they give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency management issues within geographically distributed software development projects. Consequently, the WHERE project helps make software more robust and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on graph grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behaviors of graphic and tabular notations. From these operations, subgraph Query, Selection, Merge and Replacement operations can be derived. The approach uses graph PRODUCTIONS (rewriting rules) to study the dynamic transformations of graphs. We discuss the feasibility of implementing these operations. We also present the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to the Java programming language in this thesis. A scenario based on the NASA International Space Station Specification is discussed to show the applicability of our approach. Finally, conclusions and future work on inconsistency management issues in the WHERE project are summarized.
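
    The three basic operations are ordinary set operations once a diagram is reduced to node and edge sets. A minimal Python sketch (the labels and graph-grammar productions of the thesis are omitted; the viewpoint contents are hypothetical):

        def union(g1, g2):
            return g1[0] | g2[0], g1[1] | g2[1]

        def intersection(g1, g2):
            return g1[0] & g2[0], g1[1] & g2[1]

        def difference(g1, g2):
            nodes = g1[0] - g2[0]
            # keep only edges whose endpoints survive
            edges = {(a, b) for a, b in g1[1] - g2[1] if a in nodes and b in nodes}
            return nodes, edges

        # Two hypothetical viewpoints as (nodes, edges)
        v1 = ({"Sensor", "Controller", "Logger"},
              {("Sensor", "Controller"), ("Controller", "Logger")})
        v2 = ({"Sensor", "Controller", "Display"},
              {("Sensor", "Controller"), ("Controller", "Display")})
        overlap = intersection(v1, v2)   # shared structure to check for inconsistency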

  10. [Dietopro.com: a new tool for dietotherapeutical management based on cloud computing technology].

    PubMed

    García, Candido Gabriel; Sebastià, Natividad; Blasco, Esther; Soriano, José Miguel

    2014-09-01

    Dietotherapeutical software is now a basic tool in the dietary management of patients, from both a physiological and a pathological point of view. New technologies and research in this area have favored the emergence of new applications for dietary and nutritional management that facilitate the running of a dietotherapeutical practice. The aim was to comparatively study the main dietotherapeutical applications on the market in order to give diet and nutrition professionals criteria for selecting among the main tools available. Dietopro.com is, from our point of view, one of the most comprehensive applications for the dietotherapeutical management of patients. Depending on the needs of the user, there is a choice of different dietary software packages. We conclude that no application is better or worse than another; rather, applications are more or less adapted to the needs of professionals. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  11. FAMIAS - A user-friendly new software tool for the mode identification of photometric and spectroscopic time series

    NASA Astrophysics Data System (ADS)

    Zima, W.

    2008-12-01

    FAMIAS (Frequency Analysis and Mode Identification for AsteroSeismology) is a collection of state-of-the-art software tools for the analysis of photometric and spectroscopic time series data. It is one of the deliverables of the Work Package NA5: Asteroseismology of the European Coordination Action in Helio- and Asteroseismology (HELAS). Two main sets of tools are incorporated in FAMIAS. The first set allows the user to search for periodicities in the data using Fourier and non-linear least-squares fitting algorithms. The other set allows the user to carry out a mode identification for the detected pulsation frequencies to determine their pulsational quantum numbers, the harmonic degree, ℓ, and the azimuthal order, m. For the spectroscopic mode identification, the Fourier parameter fit method and the moment method are available. The photometric mode identification is based on pre-computed grids of atmospheric parameters and non-adiabatic observables, and uses the method of amplitude ratios and phase differences in different filters. The types of stars to which FAMIAS is applicable are main-sequence pulsators hotter than the Sun. This includes the Gamma Dor stars, Delta Sct stars, the slowly pulsating B stars and the Beta Cep stars - basically all pulsating main-sequence stars for which empirical mode identification is required to successfully carry out asteroseismology. The complete manual for FAMIAS is published in a special issue of Communications in Asteroseismology, Vol. 155. The FAMIAS homepage provides the possibility to download the software and to read the on-line documentation.
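
    The first step FAMIAS automates, a periodogram search followed by a fixed-frequency sine fit, can be sketched in a few lines of Python. This is an illustration using astropy and scipy, not FAMIAS itself; the unevenly sampled light curve is synthetic:

      import numpy as np
      from astropy.timeseries import LombScargle
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0.0, 30.0, 400))            # days, uneven sampling
      f_true = 1.7                                        # cycles/day
      y = 0.01 * np.sin(2 * np.pi * f_true * t) + rng.normal(0, 0.002, t.size)

      freq, power = LombScargle(t, y).autopower(maximum_frequency=10.0)
      f0 = freq[np.argmax(power)]                         # candidate frequency

      def sine(t, a, phi, c):                             # sine fit at fixed f0
          return a * np.sin(2 * np.pi * f0 * t + phi) + c

      pars, _ = curve_fit(sine, t, y, p0=[0.01, 0.0, 0.0])
      print(f"f = {f0:.3f} c/d, amplitude = {pars[0]:.4f}")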

  12. ISOT_Calc: A versatile tool for parameter estimation in sorption isotherms

    NASA Astrophysics Data System (ADS)

    Beltrán, José L.; Pignatello, Joseph J.; Teixidó, Marc

    2016-09-01

    Geochemists and soil chemists commonly use parametrized sorption data to assess the transport and impact of pollutants in the environment. However, this evaluation is often hampered by a lack of detailed sorption data analysis, which in turn leads to inaccurate transport modeling. To this end, we present a novel software tool to precisely analyze and interpret sorption isotherm data. Our tool, coded in Visual Basic for Applications (VBA), operates embedded within the Microsoft Excel™ environment. It consists of a user-defined function named ISOT_Calc and a supplementary optimization Excel macro (Ref_GN_LM). The ISOT_Calc function estimates the solute equilibrium concentration in the aqueous and solid phases (Ce and q, respectively). Optimization of the sorption isotherm parameters is therefore very flexible, as it can be carried out over the residuals of q, Ce, or both simultaneously (i.e., orthogonal distance regression). The function includes the most common sorption isotherm models as predefined equations and also allows custom-defined models to be introduced easily. The Ref_GN_LM macro performs the parameter optimization using a Levenberg-Marquardt modified Gauss-Newton iterative procedure. To evaluate the performance of the presented tool, both the function and the optimization macro were applied to different sorption data examples described in the literature. Results showed that the optimization of the isotherm parameters was successfully achieved in all cases, indicating the robustness and reliability of the tool. Thus, the presented software tool, available to researchers and students for free, has proven to be a user-friendly and attractive alternative to conventional fitting tools used in sorption data analysis.
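
    As a minimal sketch of what the ISOT_Calc/Ref_GN_LM pair does inside Excel, here reduced to ordinary least squares over the residuals of q using scipy rather than the VBA macro (the sorption data below are hypothetical):

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(ce, qmax, kl):                 # q = qmax*KL*Ce / (1 + KL*Ce)
          return qmax * kl * ce / (1.0 + kl * ce)

      ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # Ce, mg/L (hypothetical)
      q = np.array([0.9, 1.6, 2.6, 3.8, 4.8, 5.5])     # q, mg/g (hypothetical)

      pars, cov = curve_fit(langmuir, ce, q, p0=[6.0, 0.5])
      err = np.sqrt(np.diag(cov))
      print(f"qmax = {pars[0]:.2f} +/- {err[0]:.2f} mg/g, "
            f"KL = {pars[1]:.3f} +/- {err[1]:.3f} L/mg")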

  13. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis has been performed for space missions that use nuclear heater or power units, in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative for assessing possible environmental impacts, obtaining launch approval, and planning launch contingencies. To perform a vehicle breakup analysis accurately, the analysis tool should couple a trajectory propagation algorithm with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and could predict vehicle breakup, but not the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify additional dynamics modeling and capabilities for the analysis tool, with the objectives of (1) predicting the impact point and footprint, (2) increasing the fidelity of the vehicle breakup prediction, and (3) reducing the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint include 3-degree-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of the impact footprint. The functions that increase the fidelity of the breakup prediction include a panel code that calculates hypersonic aerodynamic coefficients for an arbitrarily shaped body and the modeling of local winds. The function that reduces the effort and time required to complete an analysis is the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples demonstrate the new capabilities and improvements, with comparisons between the results of the upgraded analysis tool and the legacy software wherever applicable.
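
    A toy version of the 3-degree-of-freedom propagation step, assuming a point mass, an exponential atmosphere, and a fixed ballistic coefficient (all values hypothetical; the actual tool couples this with thermal and structural breakup models), might look like:

      import numpy as np
      from scipy.integrate import solve_ivp

      RE, MU = 6.371e6, 3.986e14        # Earth radius (m), gravitational parameter
      BETA = 300.0                      # ballistic coefficient m/(Cd*A), kg/m^2

      def rho(h):                       # crude exponential atmosphere
          return 1.225 * np.exp(-h / 7200.0)

      def eom(t, s):                    # state s = [x, y, z, vx, vy, vz]
          r, v = s[:3], s[3:]
          h = np.linalg.norm(r) - RE
          drag = -0.5 * rho(h) * np.linalg.norm(v) * v / BETA
          grav = -MU * r / np.linalg.norm(r) ** 3
          return np.concatenate([v, grav + drag])

      def hit_ground(t, s):
          return np.linalg.norm(s[:3]) - RE
      hit_ground.terminal = True

      gamma = 0.1                       # entry flight-path angle, rad
      s0 = np.array([RE + 120e3, 0.0, 0.0,
                     -7500 * np.sin(gamma), 7500 * np.cos(gamma), 0.0])
      sol = solve_ivp(eom, (0.0, 2000.0), s0, events=hit_ground, max_step=1.0)
      print(f"surface impact after {sol.t[-1]:.0f} s")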

  14. Use of a data warehouse at an academic medical center for clinical pathology quality improvement, education, and research

    PubMed Central

    Krasowski, Matthew D.; Schriever, Andy; Mathur, Gagan; Blau, John L.; Stauffer, Stephanie L.; Ford, Bradley A.

    2015-01-01

    Background: Pathology data contained within the electronic health record (EHR) and laboratory information system (LIS) of hospitals represent a potentially powerful resource to improve clinical care. However, existing reporting tools within commercial EHR and LIS software may not be able to efficiently and rapidly mine data for quality improvement and research applications. Materials and Methods: We present experience using a data warehouse produced collaboratively between an academic medical center and a private company. The data warehouse contains data from the EHR, LIS, admission/discharge/transfer system, and billing records and can be accessed using a self-service data access tool known as Starmaker. The Starmaker software allows users to apply complex Boolean logic, inclusion and exclusion rules, unit conversion and reference scaling, and value aggregation using a straightforward visual interface. More complex queries can be achieved by users with experience with Structured Query Language. Queries can use biomedical ontologies such as Logical Observation Identifiers Names and Codes and Systematized Nomenclature of Medicine. Results: We present examples of successful searches using Starmaker, falling mostly in the realm of microbiology and clinical chemistry/toxicology. The searches were ones that were either very difficult or basically infeasible using reporting tools within the EHR and LIS used in the medical center. One of the main strengths of Starmaker searches is rapid results, with typical searches covering 5 years taking only 1–2 min. A “Run Count” feature quickly outputs the number of cases meeting criteria, allowing for refinement of searches before downloading patient-identifiable data. The Starmaker tool is available to pathology residents and fellows, with some using this tool for quality improvement and scholarly projects. Conclusion: A data warehouse has significant potential for improving utilization of clinical pathology testing. Software that can access a data warehouse using a straightforward visual interface can be incorporated into pathology training programs. PMID:26284156

  15. Use of a data warehouse at an academic medical center for clinical pathology quality improvement, education, and research.

    PubMed

    Krasowski, Matthew D; Schriever, Andy; Mathur, Gagan; Blau, John L; Stauffer, Stephanie L; Ford, Bradley A

    2015-01-01

    Pathology data contained within the electronic health record (EHR) and laboratory information system (LIS) of hospitals represent a potentially powerful resource to improve clinical care. However, existing reporting tools within commercial EHR and LIS software may not be able to efficiently and rapidly mine data for quality improvement and research applications. We present experience using a data warehouse produced collaboratively between an academic medical center and a private company. The data warehouse contains data from the EHR, LIS, admission/discharge/transfer system, and billing records and can be accessed using a self-service data access tool known as Starmaker. The Starmaker software allows users to apply complex Boolean logic, inclusion and exclusion rules, unit conversion and reference scaling, and value aggregation using a straightforward visual interface. More complex queries can be achieved by users with experience with Structured Query Language. Queries can use biomedical ontologies such as Logical Observation Identifiers Names and Codes and Systematized Nomenclature of Medicine. We present examples of successful searches using Starmaker, falling mostly in the realm of microbiology and clinical chemistry/toxicology. The searches were ones that were either very difficult or basically infeasible using reporting tools within the EHR and LIS used in the medical center. One of the main strengths of Starmaker searches is rapid results, with typical searches covering 5 years taking only 1-2 min. A "Run Count" feature quickly outputs the number of cases meeting criteria, allowing for refinement of searches before downloading patient-identifiable data. The Starmaker tool is available to pathology residents and fellows, with some using this tool for quality improvement and scholarly projects. A data warehouse has significant potential for improving utilization of clinical pathology testing. Software that can access a data warehouse using a straightforward visual interface can be incorporated into pathology training programs.
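
    A hypothetical sketch of the kind of warehouse query Starmaker composes visually, expressed directly as SQL over an illustrative table (the table layout is made up; the LOINC code 2160-0 denotes serum creatinine):

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE lab_result (
          patient_id INTEGER, loinc_code TEXT, value REAL, taken_at TEXT);
      INSERT INTO lab_result VALUES
          (1, '2160-0', 2.4, '2014-03-01'),
          (1, '6690-2', 4.1, '2014-03-01'),
          (2, '2160-0', 0.9, '2014-05-10');
      """)

      # "Creatinine above 2.0 mg/dL within a five-year window" -- an
      # include rule combined with a date restriction.
      rows = con.execute("""
          SELECT patient_id, value, taken_at FROM lab_result
          WHERE loinc_code = '2160-0'
            AND value > 2.0
            AND taken_at BETWEEN '2010-01-01' AND '2015-01-01'
      """).fetchall()
      print(rows)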

  16. Trajectory design strategies that incorporate invariant manifolds and swingby

    NASA Technical Reports Server (NTRS)

    Guzman, J. J.; Cooley, D. S.; Howell, K. C.; Folta, D. C.

    1998-01-01

    Libration point orbits serve as excellent platforms for scientific investigations involving the Sun as well as planetary environments. Trajectory design in support of such missions is increasingly challenging as more complex missions are envisioned in the next few decades. Software tools for trajectory design in this regime must be further developed to incorporate better understanding of the solution space and, thus, improve the efficiency and expand the capabilities of current approaches. Only recently applied to trajectory design, dynamical systems theory now offers new insights into the natural dynamics associated with the multi-body problem. The goal of this effort is the blending of analysis from dynamical systems theory with the well established NASA Goddard software program SWINGBY to enhance and expand the capabilities for mission design. Basic knowledge concerning the solution space is improved as well.

  17. A dual-waveband dynamic IR scene projector based on DMD

    NASA Astrophysics Data System (ADS)

    Hu, Yu; Zheng, Ya-wei; Gao, Jiao-bo; Sun, Ke-feng; Li, Jun-na; Zhang, Lei; Zhang, Fang

    2016-10-01

    An infrared scene simulation system can simulate a variety of objects and backgrounds to perform dynamic tests and evaluate electro-optical (EO) detection systems in hardware-in-the-loop testing. The basic structure of a dual-waveband dynamic IR scene projector is introduced in this paper. The system's core device is an IR Digital Micro-mirror Device (DMD), and the radiant source is a miniature high-temperature IR plane blackbody. An IR collimation optical system whose transmission range covers both 3-5 μm and 8-12 μm is designed as the projection optical system. Scene simulation software was developed with Visual C++ and the Vega software tools, and a software flow chart is presented. The parameters and testing results of the system are given; the system performed satisfactorily in an IR imaging simulation test.

  18. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

    The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially with regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis, and is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
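
    The core ringdown calculation can be sketched in Python: fit linear-prediction coefficients, take the polynomial roots, and read off modal frequency and damping. This is a minimal illustration of the algorithm class, not PNNL's MATLAB implementation; the test signal is synthetic:

      import numpy as np

      def prony_modes(y, dt, order):
          # Linear prediction: y[n] = -(a1*y[n-1] + ... + ap*y[n-p])
          N = len(y)
          A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
          a, *_ = np.linalg.lstsq(A, -y[order:], rcond=None)
          roots = np.roots(np.concatenate(([1.0], a)))
          sigma = np.log(np.abs(roots)) / dt              # decay rates, 1/s
          omega = np.angle(roots) / dt                    # rad/s
          zeta = -sigma / np.sqrt(sigma**2 + omega**2)    # damping ratios
          return omega / (2 * np.pi), zeta

      dt = 0.05
      t = np.arange(0.0, 10.0, dt)
      y = np.exp(-0.3 * t) * np.cos(2 * np.pi * 0.8 * t)  # one 0.8 Hz decaying mode
      freq, zeta = prony_modes(y, dt, order=2)            # real data needs higher order
      keep = freq > 0
      print(np.round(freq[keep], 3), "Hz,", np.round(zeta[keep], 3))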

  19. DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.

    PubMed

    Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo

    2005-10-30

    The number of laboratories using techniques that allow simultaneous recordings from as many units as possible is increasing considerably. However, the development of tools used to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as proprietary file structures, high-priced licenses and restrictive intellectual property policies. This paper presents free, open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization and classification in a consistent framework. To facilitate adaptation and extension as well as the addition of new routines, tools and algorithms for data analysis, the source code and documentation are freely available.

  20. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goals of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called “residual” or “recursive” periodograms); (iv) easily calculable parametric 2D likelihood function level contours, reflecting the asymptotic confidence regions; (v) user-friendly fitting under useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for assessing statistical significance. Further functionality is planned for future releases. During the development of this software, considerable effort was made to improve calculational speed, especially for CPU-demanding tasks. PlanetPack is written in pure C++ (1998/2003 standard) and is expected to be compilable and usable on a wide range of platforms.
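
    Item (i), maximum-likelihood fitting with unknown RV jitter, reduces for a circular orbit to a few lines of Python. This is an illustrative sketch, not PlanetPack's C++; the data are synthetic:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 200.0, 60))            # epochs, days
      sigma = np.full(t.size, 2.0)                        # quoted errors, m/s
      rv = 12.0 * np.sin(2 * np.pi * t / 40.0) + rng.normal(0, 3.0, t.size)

      def neg_loglike(p):
          K, P, phi, v0, log_s2 = p
          model = K * np.sin(2 * np.pi * t / P + phi) + v0
          var = sigma**2 + np.exp(log_s2)                 # quoted errors + jitter
          r = rv - model
          return 0.5 * np.sum(r**2 / var + np.log(var))

      fit = minimize(neg_loglike, x0=[10.0, 40.0, 0.0, 0.0, 0.0],
                     method="Nelder-Mead")
      K, P, phi, v0, log_s2 = fit.x
      print(f"K = {K:.1f} m/s, P = {P:.1f} d, jitter = {np.exp(log_s2) ** 0.5:.1f} m/s")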

  1. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    ...information that are returned from the tools to the human user, and the forms in which these outputs are presented. STAGE OF DEVELOPMENT: What... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, APPENDIX 2, INTRODUCTION: This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types...

  2. RiskScape: a new tool for comparing risk from natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Stirling, M. W.; King, A.

    2010-12-01

    RiskScape is a joint venture between New Zealand's GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard-based risk and impact analysis. It has basic GIS functionality, in that it has import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example damage and replacement costs, casualties, economic losses, disruption, and number of people affected. It can therefore be used to assist with risk management, land use planning, building codes and design, risk identification, prioritization of risk reduction and mitigation, determination of “best use” risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape developments have thus far focussed on scenario-based risk, future developments will advance the software into providing probabilistic-based solutions.

  3. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    PubMed

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  4. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  5. Strategies for Using Plagiarism Software in the Screening of Incoming Journal Manuscripts: Recommendations Based on a Recent Literature Survey.

    PubMed

    Lykkesfeldt, Jens

    2016-08-01

    In recent years, several online tools have appeared capable of identifying potential plagiarism in science. While such tools may help to maintain or even increase the originality and ethical quality of the scientific literature, no apparent consensus exists among editors on the degree of plagiarism or self-plagiarism necessary to reject or retract manuscripts. In this study, two entire volumes of published original papers and reviews from Basic & Clinical Pharmacology & Toxicology were retrospectively scanned for similarity in anonymized form using iThenticate software to explore measures to predictively identify true plagiarism and self-plagiarism and to potentially provide guidelines for future screening of incoming manuscripts. Several filters were applied, all of which appeared to lower the noise from irrelevant hits. The main conclusions were that plagiarism software offers a unique opportunity to screen for plagiarism easily but also that it has to be employed with caution as automated or uncritical use is far too unreliable to allow a fair basis for judging the degree of plagiarism in a manuscript. This remains the job of senior editors. Whereas a few cases of self-plagiarism that would not likely have been accepted with today's guidelines were indeed identified, no cases of fraud or serious plagiarism were found. Potential guidelines are discussed. © 2016 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  6. Network Monitor and Control of Disruption-Tolerant Networks

    NASA Technical Reports Server (NTRS)

    Torgerson, J. Leigh

    2014-01-01

    For nearly a decade, NASA and many researchers in the international community have been developing Internet-like protocols that allow for automated network operations in networks where the individual links between nodes are only sporadically connected. A family of Disruption-Tolerant Networking (DTN) protocols has been developed, and many are reaching CCSDS Blue Book status. A NASA version of DTN known as the Interplanetary Overlay Network (ION) has been flight-tested on the EPOXI spacecraft and ION is currently being tested on the International Space Station. Experience has shown that in order for a DTN service-provider to set up a large scale multi-node network, a number of network monitor and control technologies need to be fielded as well as the basic DTN protocols. The NASA DTN program is developing a standardized means of querying a DTN node to ascertain its operational status, known as the DTN Management Protocol (DTNMP), and the program has developed some prototypes of DTNMP software. While DTNMP is a necessary component, it is not sufficient to accomplish Network Monitor and Control of a DTN network. JPL is developing a suite of tools that provide for network visualization, performance monitoring and ION node control software. This suite of network monitor and control tools complements the GSFC and APL-developed DTN MP software, and the combined package can form the basis for flight operations using DTN.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohe, Daniel Peter

    Sandia National Laboratories has recently purchased a Polytec 3D Scanning Laser Doppler Vibrometer for vibration measurement. This device has proven to be a very nice tool for making vibration measurements, and has a number of advantages over traditional sensors such as accelerometers. The non-contact nature of the laser vibrometer means there is no mass loading due to measuring the response. Additionally, the laser scanning heads can position the laser spot much more quickly and accurately than placing an accelerometer or performing a roving hammer impact. The disadvantage of the system is that a significant amount of time must be invested to align the lasers with each other and the part so that the laser spots can be accurately positioned. The Polytec software includes a number of nice tools to aid in this procedure; however, certain portions are still tedious. Luckily, the Polytec software is readily extensible by programming macros for the system, so tedious portions of the procedure can be made easier by automating the process. The Polytec Software includes a WinWrap (similar to Visual Basic) editor and interface to run macros written in that programming language. The author, however, is much more proficient in Python, and the latter also has a much larger set of libraries that can be used to create very complex macros, while taking advantage of Python’s inherent readability and maintainability.

  8. Novel Analysis Software for Detecting and Classifying Ca2+ Transient Abnormalities in Stem Cell-Derived Cardiomyocytes

    PubMed Central

    Penttinen, Kirsi; Siirtola, Harri; Àvalos-Salguero, Jorge; Vainio, Tiina; Juhola, Martti; Aalto-Setälä, Katriina

    2015-01-01

    Comprehensive functioning of Ca2+ cycling is crucial for excitation–contraction coupling of cardiomyocytes (CMs). Abnormal Ca2+ cycling is linked to arrhythmogenesis, which is associated with cardiac disorders and heart failure. Accordingly, we have generated spontaneously beating CMs from induced pluripotent stem cells (iPSC) derived from patients with catecholaminergic polymorphic ventricular tachycardia (CPVT), which is an inherited and severe cardiac disease. Ca2+ cycling studies have revealed substantial abnormalities in these CMs. Ca2+ transient analysis performed manually lacks accepted analysis criteria, and has both low throughput and high variability. To overcome these issues, we have developed a software tool, AnomalyExplorer, based on interactive visualization, to assist in the classification of Ca2+ transient patterns detected in CMs. Here, we demonstrate the usability and capability of the software and compare its analysis efficiency to that of manual analysis. We show that AnomalyExplorer is suitable for detecting normal and abnormal Ca2+ transients; furthermore, this method provides more defined and consistent information regarding the Ca2+ abnormality patterns and cell line specific differences when compared to manual analysis. This tool will facilitate and speed up the analysis of CM Ca2+ transients, making it both more accurate and user-independent. AnomalyExplorer can be exploited in Ca2+ cycling analysis to study basic disease pathology and the effects of different drugs. PMID:26308621

  9. Novel Analysis Software for Detecting and Classifying Ca2+ Transient Abnormalities in Stem Cell-Derived Cardiomyocytes.

    PubMed

    Penttinen, Kirsi; Siirtola, Harri; Àvalos-Salguero, Jorge; Vainio, Tiina; Juhola, Martti; Aalto-Setälä, Katriina

    2015-01-01

    Comprehensive functioning of Ca2+ cycling is crucial for excitation-contraction coupling of cardiomyocytes (CMs). Abnormal Ca2+ cycling is linked to arrhythmogenesis, which is associated with cardiac disorders and heart failure. Accordingly, we have generated spontaneously beating CMs from induced pluripotent stem cells (iPSC) derived from patients with catecholaminergic polymorphic ventricular tachycardia (CPVT), which is an inherited and severe cardiac disease. Ca2+ cycling studies have revealed substantial abnormalities in these CMs. Ca2+ transient analysis performed manually lacks accepted analysis criteria, and has both low throughput and high variability. To overcome these issues, we have developed a software tool, AnomalyExplorer, based on interactive visualization, to assist in the classification of Ca2+ transient patterns detected in CMs. Here, we demonstrate the usability and capability of the software and compare its analysis efficiency to that of manual analysis. We show that AnomalyExplorer is suitable for detecting normal and abnormal Ca2+ transients; furthermore, this method provides more defined and consistent information regarding the Ca2+ abnormality patterns and cell line specific differences when compared to manual analysis. This tool will facilitate and speed up the analysis of CM Ca2+ transients, making it both more accurate and user-independent. AnomalyExplorer can be exploited in Ca2+ cycling analysis to study basic disease pathology and the effects of different drugs.
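
    The underlying signal-analysis step, detecting transients and flagging abnormally broad ones, can be sketched with scipy (an illustration on a synthetic trace; AnomalyExplorer adds interactive visualization and richer abnormality classes on top of this kind of analysis):

      import numpy as np
      from scipy.signal import find_peaks

      fs = 50.0                                           # frame rate, Hz
      t = np.arange(0.0, 20.0, 1.0 / fs)
      # Synthetic trace: regular transients plus one broadened (abnormal) event.
      events = [(2, 0.15), (5, 0.15), (8, 0.60), (11, 0.15), (14, 0.15), (17, 0.15)]
      trace = sum(np.exp(-((t - c) / w) ** 2) for c, w in events)

      peaks, props = find_peaks(trace, height=0.5, width=1)
      widths_s = props["widths"] / fs                     # widths in seconds
      for p, w in zip(peaks, widths_s):
          label = "ABNORMAL" if w > 2 * np.median(widths_s) else "normal"
          print(f"peak at {t[p]:5.2f} s, width {w:.2f} s -> {label}")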

  10. Integrated Baseline System (IBS), Version 1.03. User guide: Chemical Stockpile Emergency Preparedness Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, B.M.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the user guide for the IBS and explains how to operate the IBS system. The fundamental function of the IBS is to provide tools that civilian emergency management personnel can use in developing emergency plans and in supporting emergency management activities to cope with a chemical-releasing event at a military chemical stockpile. Emergency management planners can evaluate concepts and ideas using the IBS system, and the results of that experience can then be factored into refining requirements and plans. This document provides information for the general system user and is the primary reference for the system features of the IBS. It is designed for persons who are familiar with general emergency management concepts, operations, and vocabulary. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available; the IBS manuals reference such documentation where necessary. IBS is a dynamic system whose capabilities are in a state of continuing expansion and enhancement.

  11. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  12. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Astrophysics Data System (ADS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-11-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  13. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-01-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  14. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  15. A 2D Fourier tool for the analysis of photo-elastic effect in large granular assemblies

    NASA Astrophysics Data System (ADS)

    Leśniewska, Danuta

    2017-06-01

    Fourier transforms are the basic tool in constructing different types of image filters, mainly those reducing optical noise. Some DIC and PIV software also uses frequency space to obtain displacement fields from a series of digital images of a deforming body. The paper presents a series of 2D Fourier transforms of photo-elastic transmission images representing a large pseudo-2D granular assembly deforming under varying boundary conditions. The images, relating to different scales, were acquired at the same image resolution but taken at different distances from the sample. Fourier transforms of images representing different stages of deformation reveal characteristic features at the three (‘macro-’, ‘meso-’ and ‘micro-’) scales, which can serve as data for studying the internal order-disorder transition within granular materials.
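
    The core operation is compact in Python: compute the centered 2D amplitude spectrum of an image and locate its dominant spatial frequency. Here a synthetic fringe pattern stands in for a photo-elastic image:

      import numpy as np

      rng = np.random.default_rng(0)
      fringes = np.sin(2 * np.pi * np.arange(256) / 16.0)   # 16-px fringe period
      img = np.tile(fringes, (256, 1)) + 0.2 * rng.normal(size=(256, 256))

      F = np.fft.fftshift(np.fft.fft2(img))   # zero frequency moved to the centre
      amp = np.abs(F)
      amp[128, 128] = 0.0                     # suppress the residual DC term
      py, px = np.unravel_index(np.argmax(amp), amp.shape)
      print("strongest component at offset", (py - 128, px - 128),
            "(fringes every 16 px -> kx = +/-16 expected)")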

  16. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  17. OPMILL - MICRO COMPUTER PROGRAMMING ENVIRONMENT FOR CNC MILLING MACHINES THREE AXIS EQUATION PLOTTING CAPABILITIES

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1994-01-01

    OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. Turbo BASIC and Turbo C are trademarks of Borland International.

  18. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
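
    The "reduced form" idea can be miniaturized in Python: generate many simulation outcomes, then regress total loss on the threat characteristics so that a single equation stands in for the full CGE model. All variables and coefficients below are hypothetical stand-ins:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 500
      duration = rng.uniform(1, 30, n)            # days of disruption
      severity = rng.uniform(0.1, 1.0, n)         # fraction of capacity lost
      resilience = rng.uniform(0.0, 0.8, n)       # mitigating behavior

      # Stand-in for the CGE model's simulated loss (billions of dollars).
      loss = 2.0 * duration * severity * (1 - resilience) + rng.normal(0, 1.0, n)

      X = np.column_stack([np.ones(n), duration, severity, resilience,
                           duration * severity])
      beta, *_ = np.linalg.lstsq(X, loss, rcond=None)
      print("reduced-form coefficients:", np.round(beta, 2))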

  19. Sam2bam: High-Performance Framework for NGS Data Preprocessing Tools

    PubMed Central

    Cheng, Yinhe; Tzeng, Tzy-Hwa Kathy

    2016-01-01

    This paper introduces a high-throughput software tool framework called sam2bam that enables users to significantly speed up pre-processing for next-generation sequencing data. sam2bam is especially efficient on single-node multi-core large-memory systems. It can reduce the runtime of data pre-processing for marking duplicate reads on a single-node system by 156–186x compared with de facto standard tools. sam2bam consists of parallel software components that can fully utilize multiple processors, available memory, high-bandwidth storage, and hardware compression accelerators, if available. sam2bam provides file format conversion between well-known genome file formats, from SAM to BAM, as a basic feature. Additional features such as analyzing, filtering, and converting input data are provided by plug-in tools, e.g., duplicate marking, which can be attached to sam2bam at runtime. We demonstrated that sam2bam could significantly reduce the runtime of next-generation sequencing (NGS) data pre-processing from about two hours to about one minute for a whole-exome data set on a 16-core single-node system using up to 130 GB of memory. sam2bam could reduce the runtime of NGS data pre-processing from about 20 hours to about nine minutes for a whole-genome sequencing data set on the same system using up to 711 GB of memory. PMID:27861637
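
    The baseline SAM-to-BAM conversion that sam2bam accelerates can be written in a few lines with pysam. This single-threaded sketch illustrates the task itself rather than sam2bam's parallel in-memory pipeline; the file names are placeholders:

      import pysam

      with pysam.AlignmentFile("input.sam", "r") as sam, \
           pysam.AlignmentFile("output.bam", "wb", template=sam) as bam:
          for read in sam:                 # stream reads, re-encode as BAM
              bam.write(read)

      pysam.sort("-o", "sorted.bam", "output.bam")   # coordinate sort
      pysam.index("sorted.bam")                      # enable random access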

  20. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
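
    A minimal sketch of the download-and-extract step using the Allen SDK's CellTypesCache (the calls are believed current for allensdk but should be checked against its documentation; the chosen specimen is arbitrary):

      from allensdk.core.cell_types_cache import CellTypesCache

      ctc = CellTypesCache(manifest_file="cell_types/manifest.json")
      cells = ctc.get_cells()                        # metadata for all cells
      specimen_id = cells[0]["id"]                   # arbitrary first specimen
      data_set = ctc.get_ephys_data(specimen_id)     # NWB file, cached locally
      sweep = data_set.get_sweep(data_set.get_sweep_numbers()[0])
      print(specimen_id, sweep["sampling_rate"], "Hz")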

  1. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs, and enables a better understanding of the theoretical foundations of spectroscopy and imaging spectroscopy in a 'hands-on' activity. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
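
    Spectral angle mapping, one of the tool's built-in algorithms, is essentially one formula: the angle between a pixel spectrum and a reference spectrum, where a smaller angle means a closer match. A numpy sketch with hypothetical five-band spectra:

      import numpy as np

      def spectral_angle(s, ref):
          cos = np.dot(s, ref) / (np.linalg.norm(s) * np.linalg.norm(ref))
          return np.arccos(np.clip(cos, -1.0, 1.0))  # radians

      ref = np.array([0.12, 0.18, 0.31, 0.40, 0.38])  # library spectrum
      pix = np.array([0.10, 0.17, 0.29, 0.41, 0.36])  # observed spectrum
      print(f"spectral angle = {spectral_angle(pix, ref):.4f} rad")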

  2. XML schemas for common bioinformatic data types and their application in workflow systems.

    PubMed

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-11-06

    Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data--therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at http://bioschemas.sourceforge.net, the BioDOM library can be obtained at http://biodom.sourceforge.net. The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios.
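
    Validating an instance document against such an XML schema takes a few lines with lxml (the file names are hypothetical; the HOBIT schemas themselves are the .xsd files distributed at the URL above):

      from lxml import etree

      schema = etree.XMLSchema(etree.parse("sequence.xsd"))
      doc = etree.parse("my_sequences.xml")

      if schema.validate(doc):
          print("document conforms to the schema")
      else:
          for err in schema.error_log:
              print(err.line, err.message)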

  3. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  4. Integrating and Managing Bim in GIS, Software Review

    NASA Astrophysics Data System (ADS)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS packages that provide tools for integrating building information in a geographic context has risen sharply. More and more GIS packages add tools for this purpose, and other software projects regularly extend these tools. However, each package has its own strengths, weaknesses and intended use. This paper provides a thorough review that investigates the capabilities of such software and clarifies its purpose. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format, and the GIS software under review was then used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, the formats supported (import/export), and the way building information is imported.

  5. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and the source code is freely available, and therefore they can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which always depends on the software provider and is always subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  6. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  7. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  8. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    PubMed

    Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris

    2014-09-29

    The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based.

  9. A Microsoft-Excel-based tool for running and critically appraising network meta-analyses—an overview and application of NetMetaXL

    PubMed Central

    2014-01-01

    Background The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. Methods We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. Results We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Conclusions Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based. PMID:25267416

  10. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions that provide geometric orbit-related analysis, including propagation of orbits at varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time-series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
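
    For orientation, the circular-orbit case described above (an orbit fixed by four elements, solved as pure geometry over a rotating spheroid) can be sketched in a few lines. The following Python fragment is an illustrative reconstruction, not the GOATS MATLAB code; the spherical-Earth simplification and constant names are our assumptions:

      import math

      MU = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
      OMEGA_E = 7.2921159e-5     # Earth rotation rate, rad/s

      def circular_ground_track(a, inc, raan, u0, t):
          """Spherical-Earth ground track of a circular orbit at time t [s]:
          semi-major axis a [m], inclination inc, RAAN raan and initial
          argument of latitude u0 (all radians) - the 'four elements' case."""
          n = math.sqrt(MU / a**3)                 # mean motion, rad/s
          u = u0 + n * t                           # argument of latitude
          lat = math.asin(math.sin(inc) * math.sin(u))
          lon_i = raan + math.atan2(math.cos(inc) * math.sin(u), math.cos(u))
          lon = (lon_i - OMEGA_E * t + math.pi) % (2 * math.pi) - math.pi
          return math.degrees(lat), math.degrees(lon)

      # One point per minute along the track of a 500 km, 51.6 deg orbit
      track = [circular_ground_track(6878e3, math.radians(51.6), 0.0, 0.0, 60 * k)
               for k in range(90)]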

  11. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

  12. Stereographic Projection Techniques for Geologists and Civil Engineers

    NASA Astrophysics Data System (ADS)

    Lisle, Richard J.; Leyshon, Peter R.

    2004-05-01

    An essential tool in the fields of structural geology and geotechnics, stereographic projection allows three-dimensional orientation data to be represented and manipulated. This revised edition presents a basic introduction to the subject with examples, illustrations and exercises that encourage the student to visualize the problems in three dimensions. It will provide students of geology, rock mechanics, and geotechnical and civil engineering with an indispensable guide to the analysis and interpretation of field orientation data. Links to useful web resources and software programs are also provided. First Edition published by Butterworth-Heinemann (1996): 0-750-62450-7
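
    The projections the book teaches reduce to two short formulas: a line with plunge p and trend t plots at radius r = R*sqrt(2)*sin((90 - p)/2) on an equal-area (Schmidt) net, or r = R*tan((90 - p)/2) on an equal-angle (Wulff) net, at bearing t. A minimal Python sketch of both (the function name and sign conventions are ours, not the book's):

      import math

      def project_line(plunge_deg, trend_deg, net="equal_area", R=1.0):
          """Project a lineation (plunge/trend, degrees) onto a lower-hemisphere
          net. 'equal_area' gives a Schmidt net, 'equal_angle' a Wulff net.
          Returns Cartesian (x, y) with north up (+y) and east right (+x)."""
          theta = math.radians(90.0 - plunge_deg) / 2.0  # half the angle from vertical
          if net == "equal_area":
              r = R * math.sqrt(2.0) * math.sin(theta)
          else:  # equal-angle (true stereographic)
              r = R * math.tan(theta)
          t = math.radians(trend_deg)
          return r * math.sin(t), r * math.cos(t)

      # A vertical line plots at the centre, a horizontal one on the primitive circle
      print(project_line(90, 0))   # -> (0.0, 0.0)
      print(project_line(0, 45))   # -> on the circle, at bearing 045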

  13. Development and Application of Collaborative Optimization Software for Plate - fin Heat Exchanger

    NASA Astrophysics Data System (ADS)

    Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang

    2017-12-01

    This paper introduces the design ideas behind, and application examples of, calculation software for plate-fin heat exchangers. Because designing and optimizing heat exchangers involves a large amount of computation, we used Visual Basic 6.0 as the development platform to build a basic calculation program that reduces this computational burden. The design case is a plate-fin heat exchanger sized for boiler tail flue gas, and the software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation while giving results comparable to traditional methods, so it has high practical value.
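
    As an illustration of the traditional design method such a tool automates, the rating step is commonly an effectiveness-NTU calculation. The sketch below is a generic Python rendering of that method for a crossflow core with both fluids unmixed, not the paper's VB 6.0 code; the example numbers are invented:

      import math

      def effectiveness_crossflow_unmixed(ntu, cr):
          """Effectiveness of a crossflow core with both fluids unmixed
          (standard approximation); cr = C_min / C_max."""
          if cr == 0.0:                       # one fluid condensing/evaporating
              return 1.0 - math.exp(-ntu)
          return 1.0 - math.exp((ntu**0.22 / cr) * (math.exp(-cr * ntu**0.78) - 1.0))

      def rate_exchanger(UA, m1, cp1, T1_in, m2, cp2, T2_in):
          """Outlet temperatures from the effectiveness-NTU method
          (stream 1 assumed to be the hot side)."""
          C1, C2 = m1 * cp1, m2 * cp2
          Cmin, Cmax = min(C1, C2), max(C1, C2)
          eps = effectiveness_crossflow_unmixed(UA / Cmin, Cmin / Cmax)
          q = eps * Cmin * (T1_in - T2_in)    # heat duty, W
          return T1_in - q / C1, T2_in + q / C2

      # Example: flue gas (1.2 kg/s) heating air (1.0 kg/s) through a 1500 W/K core
      print(rate_exchanger(1500.0, 1.2, 1100.0, 250.0, 1.0, 1005.0, 20.0))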

  14. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error (RMSE). Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
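
    The comparison statistics named above are straightforward raster operations. A minimal Python/NumPy sketch (the array layout and the NaN no-data convention are assumptions):

      import numpy as np

      def dem_difference_stats(dem_test, dem_ref):
          """Cell-wise difference statistics between a generated DEM and the
          reference DEM (2-D arrays on the same grid; NaN marks no-data)."""
          diff = np.asarray(dem_test, float) - np.asarray(dem_ref, float)
          d = diff[np.isfinite(diff)]                  # drop no-data cells
          return {
              "min": float(d.min()),
              "max": float(d.max()),
              "mean": float(d.mean()),
              "rmse": float(np.sqrt(np.mean(d**2))),
          }

      # Example with synthetic rasters
      ref = np.random.rand(100, 100) * 50
      test = ref + np.random.normal(0, 0.3, ref.shape)
      print(dem_difference_stats(test, ref))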

  15. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  16. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  17. Current and future trends in marine image annotation software

    NASA Astrophysics Data System (ADS)

    Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.

    2016-12-01

    Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level - may involve human interaction and computer-assisted solutions. Marine image annotation software (MIAS) have enabled over 500 publications to date. We review the functioning, application trends and developments by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input are basically a graphical user interface with a video player or image browser that recognizes a specific time code or image code, allowing the user to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software by their capability of integrating data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS have three main characteristics: annotating events in real time, annotating after acquisition, and interacting with a database. They range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow users to input and display data from multiple sensors or multiple annotators via intranet or internet. Posterior human-mediated annotation often includes tools for data display and image analysis, e.g. length, area, image segmentation and point count, and in a few cases the possibility of browsing and editing previous dive logs or analyzing the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data. Progress in the field of automated annotation is mostly in post-processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated processes of pixel recognition through computer-vision modules that compile expert-based knowledge. Important topics aiding the choice of specific software are outlined, the ideal software is discussed and future trends are presented.

  18. WinHPC System Software | High-Performance Computing | NREL

    Science.gov Websites

    Learn about the software applications, tools, and toolchains available on the WinHPC system for industrial applications, including the Intel Compilers development tool and toolchain suite.

  19. Software Reviews Since Acquisition Reform - The Artifact Perspective

    DTIC Science & Technology

    2004-01-01

    Excerpt from briefing slides, "Acquisition of Software Intensive Systems" (Peter Hantos, 2004): contrasts old and new approaches to risk management and the old single, basic software paradigm (single processor) with current practice, and notes that software risk mitigation trade-offs must be made together with integral software engineering activities, process maturity and quality frameworks.

  20. Dental Informatics tool “SOFPRO” for the study of oral submucous fibrosis

    PubMed Central

    Erlewad, Dinesh Masajirao; Mundhe, Kalpana Anandrao; Hazarey, Vinay K

    2016-01-01

    Background: Dental informatics is an evolving branch widely used in dental education and practice. Numerous applications that support clinical care, education and research have been developed. However, very few such applications have been developed and utilized in epidemiological studies of oral submucous fibrosis (OSF), which affects a significant population of Asian countries. Aims and Objectives: To design and develop a user-friendly software program for the descriptive epidemiological study of OSF. Materials and Methods: With the help of a software engineer, a computer program, SOFPRO, was designed and developed using MS Visual Basic 6.0 (VB), MS Access 2000, Crystal Reports 7.0 and MS Paint on the Windows XP operating system. For analysis, the available OSF data from the departmental precancer registry were fed into SOFPRO. Results: Known, not-known and null data were successfully accepted in data entry and represented in the data analysis of OSF. Smooth working of SOFPRO and its correct data flow were tested against real-time data of OSF. Conclusion: SOFPRO was found to be a user-friendly automated tool for easy data collection, retrieval, management and analysis of OSF patients. PMID:27601808

  1. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  2. bioWeb3D: an online webGL 3D data visualisation tool

    PubMed Central

    2013-01-01

    Background Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for researchers without specific training; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the functionality necessary to represent and manipulate biological 3D datasets, very few packages are easily accessible (browser-based), cross-platform and usable by non-expert users. Results An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in Javascript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Conclusions Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets. PMID:23758781

  3. MIAQuant, a novel system for automatic segmentation, measurement, and localization comparison of different biomarkers from serialized histological slices.

    PubMed

    Casiraghi, Elena; Cossa, Mara; Huber, Veronica; Rivoltini, Licia; Tozzi, Matteo; Villa, Antonello; Vergani, Barbara

    2017-11-02

    In clinical practice, automatic image analysis methods that quickly quantify histological results by objective and replicable means are becoming increasingly necessary and widespread. Although several commercial software products are available for this task, they offer little flexibility and are provided as black boxes without modifiable source code. To overcome these problems, we employed the commonly used MATLAB platform to develop an automatic method, MIAQuant, for the analysis of histochemical and immunohistochemical images stained with various methods and acquired by different tools. It automatically extracts and quantifies markers characterized by various colors and shapes; furthermore, it aligns contiguous tissue slices stained by different markers and overlaps them with differing colors for visual comparison of their localization. Application of MIAQuant in clinical research fields, such as oncology and cardiovascular disease studies, has proven its efficacy, robustness and flexibility with respect to various problems; we highlight that the flexibility of MIAQuant makes it an important tool for basic research, where needs are constantly changing. MIAQuant software and its user manual are freely available for clinical studies, pathological research, and diagnosis.
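
    To make the marker-extraction idea concrete, the core of such a pipeline can be reduced to colour-distance thresholding followed by area statistics. The following Python/NumPy fragment is a deliberately simplified sketch, not MIAQuant's MATLAB implementation; the tolerance value and reference colour are illustrative:

      import numpy as np

      def marker_area_fraction(rgb, target, tol=40.0):
          """Crude marker quantification: flag pixels whose RGB colour lies
          within 'tol' (Euclidean distance) of a reference marker colour.
          rgb: H x W x 3 uint8 image; target: (r, g, b) of the stain."""
          dist = np.linalg.norm(rgb.astype(float) - np.asarray(target, float), axis=-1)
          mask = dist < tol                     # binary marker segmentation
          return mask, mask.mean()              # mask + stained-area fraction

      # Example on a synthetic image: quantify a brown DAB-like stain
      img = np.full((64, 64, 3), 255, np.uint8)
      img[10:30, 10:30] = (140, 80, 40)         # synthetic stained region
      mask, frac = marker_area_fraction(img, (140, 80, 40))
      print(f"stained area fraction: {frac:.3f}")   # -> 0.098 (400/4096 pixels)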

  4. VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1994-01-01

    VTGRAPH is a graphics software tool for DEC/VT or VT-compatible terminals, which are widely used by government and industry. It is a FORTRAN- or C-callable library designed to let the user work in many computing environments that use VT terminals for window management and graphics. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is portable to many different computers which use VT terminals. With this graphics package, the user can easily design friendlier user-interface programs and write PLOT10 programs on VT terminals under different computer systems. VTGRAPH was developed using the ReGIS graphics set, which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10-compatible drawing, generic program routines for two- and three-dimensional plotting, and color or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGIS graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.

  5. Online, offline, realtime: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-01-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well-established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hardware and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras, aspects of image engineering like the use of controlled illumination or projection technologies, and algorithmic and software aspects like automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on the one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline, or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to be able to cover the still-growing demands of the industrial end-users.

  6. Photogrammetry in the line: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-05-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well-established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hardware and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras, aspects of image engineering like the use of controlled illumination or projection technologies, and algorithmic and software aspects like automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on the one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to be able to cover the still-growing demands of the industrial end-users.

  7. Computer-Aided Sensor Development Focused on Security Issues.

    PubMed

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for sensor development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for sensor system development, and presents directions for future research.

  8. Computer-Aided Sensor Development Focused on Security Issues

    PubMed Central

    Bialas, Andrzej

    2016-01-01

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for sensor development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for sensor system development, and presents directions for future research. PMID:27240360

  9. DiSCaMB: a software library for aspherical atom model X-ray scattering factor calculations with CPUs and GPUs.

    PubMed

    Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria

    2018-02-01

    It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB, 'densities in structural chemistry and molecular biology', has been developed. It addresses the challenge of providing high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.
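
    For context, the quantity such a library evaluates is the structure factor F(h) = sum_j f_j * exp(2*pi*i*(h.r_j)); the aspherical pseudoatom bank replaces the scalar form factors f_j with richer, parameter-dependent terms. A toy Python sketch of the classical spherical-atom sum only (the caller supplies pre-evaluated scalar f_j; the example values are invented):

      import cmath, math

      def structure_factor(hkl, atoms):
          """Classical spherical-atom structure factor
              F(h) = sum_j f_j * exp(2*pi*i * (h*x_j + k*y_j + l*z_j))
          atoms: list of (f_j, (x, y, z)) with fractional coordinates and a
          scalar form factor f_j already evaluated for this reflection."""
          h, k, l = hkl
          return sum(f * cmath.exp(2j * math.pi * (h * x + k * y + l * z))
                     for f, (x, y, z) in atoms)

      # Example: a toy two-atom cell
      F = structure_factor((1, 1, 0), [(6.0, (0.0, 0.0, 0.0)),
                                       (8.0, (0.5, 0.5, 0.5))])
      print(abs(F), cmath.phase(F))   # -> 14.0, ~0.0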

  10. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  11. Modular design of synthetic gene circuits with biological parts and pools.

    PubMed

    Marchisio, Mario Andrea

    2015-01-01

    Synthetic gene circuits can be designed in an electronic fashion by displaying their basic components (Standard Biological Parts and Pools of molecules) on the computer screen and connecting them with hypothetical wires. This procedure, achieved by our add-on for the software ProMoT, was successfully applied to bacterial circuits. Recently, we have extended this design methodology to eukaryotic cells. Here, highly complex components such as promoters and Pools of mRNA contain hundreds of species and reactions whose calculation demands a rule-based modeling approach. We showed how to build such complex modules via the joint employment of the software BioNetGen (rule-based modeling) and ProMoT (modularization). In this chapter, we illustrate how to utilize our computational tool for synthetic biology with the in silico implementation of a simple eukaryotic gene circuit that performs the logic AND operation.
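
    As a toy illustration of the target behaviour (not the rule-based BioNetGen/ProMoT model the chapter builds), a transcriptional AND gate is often idealized as the product of two Hill activation terms, so the output is high only when both inputs are high. A minimal Python sketch with invented parameters:

      def hill(x, K, n):
          """Hill activation term in [0, 1)."""
          return x**n / (K**n + x**n)

      def and_gate_output(tf1, tf2, K1=1.0, K2=1.0, n=2, v_max=1.0):
          """Toy steady-state output of a promoter requiring both activators:
          high only when BOTH transcription-factor inputs are high."""
          return v_max * hill(tf1, K1, n) * hill(tf2, K2, n)

      # Truth-table-like behaviour for low/high input levels
      for a in (0.1, 10.0):
          for b in (0.1, 10.0):
              print(f"in=({a}, {b}) -> out={and_gate_output(a, b):.3f}")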

  12. Design of Instrument Control Software for Solar Vector Magnetograph at Udaipur Solar Observatory

    NASA Astrophysics Data System (ADS)

    Gosain, Sanjay; Venkatakrishnan, P.; Venugopalan, K.

    2004-04-01

    A magnetograph is an instrument which measures the solar magnetic field by measuring Zeeman-induced polarization in solar spectral lines. In a typical filter-based magnetograph there are three main modules, namely the polarimeter, the narrow-band spectrometer (filter), and the imager (CCD camera). For successful operation of the magnetograph it is essential that these modules work in synchronization with each other. Here, we describe the design of the instrument control system implemented for the Solar Vector Magnetograph under development at Udaipur Solar Observatory. The control software is written in Visual Basic and exploits Component Object Model (COM) components for fast and flexible application development. The user can interact with the instrument modules through a Graphical User Interface (GUI) and can program the sequence of magnetograph operations. The integration of Interactive Data Language (IDL) ActiveX components in the interface provides a powerful tool for online visualization, analysis and processing of images.

  13. Installing and Setting Up the Git Software Tool on OS X | High-Performance Computing | NREL

    Science.gov Websites

    Learn how to install and set up the Git software tool on OS X for use with the Peregrine system. The binary installer for OS X is the easiest option: you can download the latest version of Git from http://git-scm.com

  14. Spacecraft Avionics Software Development Then and Now: Different but the Same

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Garman, John (Jack); Vice, Jason

    2012-01-01

    NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's historic Software Production Facility (SPF) was developed to serve complex avionics software solutions during an era dominated by mainframes, tape drives, and lower-level programming languages. These systems have proven themselves resilient enough to serve the Shuttle Orbiter avionics life cycle for decades. The SPF and its predecessor, the Software Development Lab (SDL), at NASA's Johnson Space Center (JSC) hosted flight software (FSW) engineering, development, simulation, and test. It was active from the beginning of Shuttle Orbiter development in 1972 through the end of the shuttle program in the summer of 2011 - almost 40 years. NASA's Kedalion engineering analysis lab is on the forefront of validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture in avionics software engineering. Kedalion has validated many of the Orion project's HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics environment, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, COTS products, early rapid prototyping, in-house expertise and tools, and customer collaboration, NASA has adopted a cost-effective paradigm that is currently serving Orion effectively. This paper will explore and contrast differences in technology employed over the years of NASA's space program, due largely to technological advances in hardware and software systems, while acknowledging that the basic software engineering and integration paradigms share many similarities.

  15. MoRFchibi SYSTEM: software tools for the identification of MoRFs in protein sequences.

    PubMed

    Malhis, Nawar; Jacobson, Matthew; Gsponer, Jörg

    2016-07-08

    Molecular recognition features, MoRFs, are short segments within longer disordered protein regions that bind to globular protein domains in a process known as disorder-to-order transition. MoRFs have been found to play a significant role in signaling and regulatory processes in cells. High-confidence computational identification of MoRFs remains an important challenge. In this work, we introduce MoRFchibi SYSTEM, which contains three MoRF predictors: MoRFCHiBi, a basic predictor best suited as a component in other applications; MoRFCHiBi_Light, ideal for high-throughput predictions; and MoRFCHiBi_Web, slower than the other two but best for high-accuracy predictions. Results show that MoRFchibi SYSTEM provides more than double the precision of other predictors. MoRFchibi SYSTEM is available in three different forms: as an HTML web server, a RESTful web server and downloadable software at: http://www.chibi.ubc.ca/faculty/joerg-gsponer/gsponer-lab/software/morf_chibi/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. ICW eHealth Framework.

    PubMed

    Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas

    2008-01-01

    The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of many years of experience of ICW in the development and use of in-house health care solutions and represents the foundation of ICW product developments based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been leveraged to allow development by external partners - enabling adopters a straightforward integration into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, architectural guidelines and conventions defining a full software-development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.

  17. TRAC, a collaborative computer tool for tracer-test interpretation

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Klinka, T.; Thiéry, D.; Buscarlet, E.; Binet, S.; Jozja, N.; Défarge, C.; Leclerc, B.; Fécamp, C.; Ahumada, Y.; Elsass, J.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results from hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing users to add their own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
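
    One classic member of the family of analytical solutions such a tool assembles is the one-dimensional advection-dispersion solution for an instantaneous injection, C(x,t) = M / (n*A*sqrt(4*pi*D*t)) * exp(-(x - v*t)^2 / (4*D*t)). The Python sketch below illustrates that solution family, not TRAC's code; the example parameter values are invented:

      import math

      def breakthrough_instantaneous(x, t, M, A, n, v, D):
          """1-D advection-dispersion concentration [kg/m^3] at distance x [m]
          and time t [s] for an instantaneous injection of mass M [kg] over a
          flow section A [m^2] with porosity n, seepage velocity v [m/s] and
          longitudinal dispersion coefficient D [m^2/s]."""
          if t <= 0:
              return 0.0
          spread = math.sqrt(4.0 * math.pi * D * t)
          return (M / (n * A * spread)) * math.exp(-(x - v * t) ** 2 / (4.0 * D * t))

      # Hourly breakthrough curve 50 m downstream of a 1 kg injection
      curve = [(t, breakthrough_instantaneous(50.0, t, 1.0, 10.0, 0.25, 1e-4, 1e-3))
               for t in range(3600, 10 * 86400, 3600)]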

  18. NCAR Earth Observing Laboratory's Data Tracking System

    NASA Astrophysics Data System (ADS)

    Cully, L. E.; Williams, S. F.

    2014-12-01

    The NCAR Earth Observing Laboratory (EOL) maintains an extensive collection of complex, multi-disciplinary datasets from national and international, current and historical projects accessible through field project web pages (https://www.eol.ucar.edu/all-field-projects-and-deployments). Data orders are processed through the EOL Metadata Database and Cyberinfrastructure (EMDAC) system. Behind the scenes is the institutionally created EOL Computing, Data, and Software/Data Management Group (CDS/DMG) Data Tracking System (DTS) tool. The DTS is used to track the complete life cycle (from ingest to long term stewardship) of the data, metadata, and provenance for hundreds of projects and thousands of data sets. The DTS is an EOL internal only tool which consists of three subsystems: Data Loading Notes (DLN), Processing Inventory Tool (IVEN), and Project Metrics (STATS). The DLN is used to track and maintain every dataset that comes to the CDS/DMG. The DLN captures general information such as title, physical locations, responsible parties, high level issues, and correspondence. When the CDS/DMG processes a data set, IVEN is used to track the processing status while collecting sufficient information to ensure reproducibility. This includes detailed "How To" documentation, processing software (with direct links to the EOL Subversion software repository), and descriptions of issues and resolutions. The STATS subsystem generates current project metrics such as archive size, data set order counts, "Top 10" most ordered data sets, and general information on who has ordered these data. The DTS was developed over many years to meet the specific needs of the CDS/DMG, and it has been successfully used to coordinate field project data management efforts for the past 15 years. This paper will describe the EOL CDS/DMG Data Tracking System including its basic functionality, the provenance maintained within the system, lessons learned, potential improvements, and future developments.

  19. Compositional Effects on Nickel-Base Superalloy Single Crystal Microstructures

    NASA Technical Reports Server (NTRS)

    MacKay, Rebecca A.; Gabb, Timothy P.; Garg, Anita; Rogers, Richard B.; Nathal, Michael V.

    2012-01-01

    Fourteen nickel-base superalloy single crystals containing 0 to 5 wt% chromium (Cr), 0 to 11 wt% cobalt (Co), 6 to 12 wt% molybdenum (Mo), 0 to 4 wt% rhenium (Re), and fixed amounts of aluminum (Al) and tantalum (Ta) were examined to determine the effect of bulk composition on basic microstructural parameters, including gamma' solvus, gamma' volume fraction, volume fraction of topologically close-packed (TCP) phases, phase chemistries, and gamma - gamma' lattice mismatch. Regression models were developed to describe the influence of bulk alloy composition on the microstructural parameters and were compared to predictions by a commercially available software tool that used computational thermodynamics. Co produced the largest change in gamma' solvus over the wide compositional range used in this study, and Mo produced the largest effect on the gamma lattice parameter and the gamma - gamma' lattice mismatch over its compositional range, although Re had a very potent influence on all microstructural parameters investigated. Changing the Cr, Co, Mo, and Re contents in the bulk alloy had a significant impact on their concentrations in the gamma matrix and, to a smaller extent, in the gamma' phase. The gamma phase chemistries exhibited strong temperature dependencies that were influenced by the gamma and gamma' volume fractions. A computational thermodynamic modeling tool significantly underpredicted gamma' solvus temperatures and grossly overpredicted the amount of TCP phase at 982 C. Furthermore, the predictions by the software tool for the gamma - gamma' lattice mismatch were typically of the wrong sign and magnitude, but predictions could be improved if TCP formation was suspended within the software program. However, the statistical regression models provided excellent estimations of the microstructural parameters based on bulk alloy composition, thereby demonstrating their usefulness.
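
    To illustrate the statistical-regression approach (as opposed to the computational-thermodynamics tool), a linear model of a microstructural parameter versus bulk composition can be fitted by ordinary least squares. The Python/NumPy sketch below uses invented compositions and solvus values, not the paper's data or fitted coefficients:

      import numpy as np

      # Hypothetical training data: wt% Cr, Co, Mo, Re and a measured
      # gamma-prime solvus [deg C] per alloy (illustrative values only)
      X = np.array([[2, 5, 8, 0], [4, 10, 6, 2], [0, 0, 12, 4],
                    [5, 11, 7, 1], [3, 8, 10, 3], [1, 3, 9, 2]], float)
      y = np.array([1205, 1168, 1252, 1150, 1210, 1228], float)

      # Fit solvus ~ b0 + b . composition by ordinary least squares
      A = np.hstack([np.ones((len(X), 1)), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("intercept and Cr/Co/Mo/Re coefficients:", coef.round(2))

      # Predict the solvus of a new composition (leading 1 = intercept term)
      new = np.array([1, 3.0, 7.0, 8.0, 2.0])
      print("predicted solvus [C]:", float(new @ coef))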

  20. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  1. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.
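
    As one concrete example of the kind of model these tools expose, a classic secondary model relating growth rate to temperature is the Ratkowsky square-root model, sqrt(mu_max) = b*(T - T_min). A minimal Python sketch with illustrative (not fitted) parameters:

      def mu_max_ratkowsky(T, b=0.023, T_min=-2.5):
          """Ratkowsky square-root model: sqrt(mu_max) = b * (T - T_min).
          Returns the maximum specific growth rate [1/h]; b and T_min here
          are illustrative values, not fitted data."""
          if T <= T_min:
              return 0.0
          return (b * (T - T_min)) ** 2

      for T in (0, 4, 10, 25):
          print(f"{T:>2} degC -> mu_max = {mu_max_ratkowsky(T):.4f} 1/h")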

  2. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  3. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. The engineers today empower the tools versus the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to accomplish this is to promote the evolution of software engineering from an ad hoc, labor intensive activity to a managed, technology supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment and educating the personnel.

  4. Low-cost diffuse optical tomography for the classroom

    NASA Astrophysics Data System (ADS)

    Minagawa, Taisuke; Zirak, Peyman; Weigel, Udo M.; Kristoffersen, Anna K.; Mateos, Nicolas; Valencia, Alejandra; Durduran, Turgut

    2012-10-01

    Diffuse optical tomography (DOT) is an emerging imaging modality with potential applications in oncology, neurology, and other clinical areas. It allows the non-invasive probing of the tissue function using relatively inexpensive and safe instrumentation. An educational laboratory setup of a DOT system could be used to demonstrate how photons propagate through tissues, basics of medical tomography, and the concepts of multiple scattering and absorption. Here, we report a DOT setup that could be introduced to the advanced undergraduate or early graduate curriculum using inexpensive and readily available tools. The basis of the system is the LEGO Mindstorms NXT platform which controls the light sources, the detectors (photo-diodes), a mechanical 2D scanning platform, and the data acquisition. A basic tomographic reconstruction is implemented in standard numerical software, and 3D images are reconstructed. The concept was tested and developed in an educational environment that involved a high-school student and a group of post-doctoral fellows.
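
    A "basic tomographic reconstruction" of the kind mentioned can be as simple as the algebraic reconstruction technique (ART), i.e. Kaczmarz sweeps over the ray equations A x = b, where each row of A traces one source-detector line through the pixel grid. The following Python/NumPy sketch is a generic illustration, not the course's actual code:

      import numpy as np

      def art_reconstruct(A, b, n_sweeps=200, relax=1.0):
          """Algebraic Reconstruction Technique (Kaczmarz): each ray i gives
          one linear equation A[i] @ x = b[i] (line integral of absorption
          through the pixel grid); cycle over rays, projecting x onto each."""
          x = np.zeros(A.shape[1])
          row_norm2 = (A * A).sum(axis=1)
          for _ in range(n_sweeps):
              for i in range(len(b)):
                  if row_norm2[i] > 0:
                      x += relax * (b[i] - A[i] @ x) / row_norm2[i] * A[i]
          return x

      # 2x2 pixel phantom probed by row, column and diagonal rays
      A = np.array([[1, 1, 0, 0], [0, 0, 1, 1],          # rows
                    [1, 0, 1, 0], [0, 1, 0, 1],          # columns
                    [1, 0, 0, 1], [0, 1, 1, 0]], float)  # diagonals
      phantom = np.array([0.0, 1.0, 1.0, 0.0])
      print(art_reconstruct(A, A @ phantom).round(3))    # recovers the phantom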

  5. Basic Radar Altimetry Toolbox and Radar Altimetry Tutorial: Tools for all Altimetry Users

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, J.; Breebaart, L.; Bronner, E.; Dinardo, S.; Earith, D.; Lucas, B. M.; Maheu, C.; Niejmeier, S.; Picot, N.

    2013-09-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, Saral. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings; more than 2000 people had downloaded it as of the end of September 2012, with many "newcomers" to altimetry, as well as teachers and students, among them. User feedback, developments in altimetry, and practice showed that interesting new features could be added: some have been added and/or improved in versions 2 to 4, while others are under development or in discussion for the future. The Basic Radar Altimetry Toolbox is able to read most distributed radar altimetry data, including data from future missions like Saral and Jason-3; to perform some processing, data editing and statistics; and to visualize the results. It can be used at several levels and in several ways, including as an educational tool with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  6. The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2010-01-01

    The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).

  7. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists, independently or with the support of computer experts, and are often made freely available for the benefit of the scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  8. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  9. A Decision Support System for Planning, Control and Auditing of DoD Software Cost Estimation.

    DTIC Science & Technology

    1986-03-01

    is frequently used in U.S. Air Force software cost estimates. Barry Boehm's Constructive Cost Estimation Model (COCOMO) was recently selected for use...are considered basic to the proper development of software. Pressman [Ref. 11] addresses these basic elements in a manner which attempts to integrate...H., Jr., and Carlson, Eric D., Building Effective Decision Support Systems, Prentice-Hall, Englewood, NJ, 1982 11. Pressman, Roger S., A Practitioner's A

  10. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  11. MFV-class: a multi-faceted visualization tool of object classes.

    PubMed

    Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting

    2004-11-01

    Classes are key software components in an object-oriented software system. Many industrial OO software systems contain classes with complicated structure and relationships, and during software maintenance, testing, reengineering, reuse, and restructuring it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of a class to uncover manifold facets of its contents, and it visualizes three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.
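
    The abstract does not name the three metrics MFV-Class visualizes. As a hedged illustration of the kind of per-class measurement such a tool starts from, the following Python sketch counts methods and class attributes with the standard ast module; the metric choices are stand-ins, not the paper's.

    ```python
    # Hedged sketch: extracting simple class metrics with Python's ast module.
    # Method and attribute counts are stand-in examples of class metrics.
    import ast

    source = '''
    class Account:
        rate = 0.05
        def deposit(self, x): self.balance += x
        def withdraw(self, x): self.balance -= x
    '''

    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n.name for n in node.body if isinstance(n, ast.FunctionDef)]
            attrs = [n.targets[0].id for n in node.body
                     if isinstance(n, ast.Assign) and isinstance(n.targets[0], ast.Name)]
            print(node.name, "methods:", len(methods), "class attributes:", len(attrs))
    ```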

  12. VOLCWORKS: A suite for optimization of hazards mapping

    NASA Astrophysics Data System (ADS)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science, and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that forecast the behavior of the materials produced by different eruptive processes. In spite of this evolution of tools and methodologies, however, the purpose of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integrating the different simulation tools for the different processes of a single volcano remains an open problem, one that calls for software combining processing, simulation, and visualization techniques with suitable data structures into a suite that supports the map-construction process, from the integration of geological data through simulation to the simplification of the output into a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. Yet the workflow from data collection, quality control, and preparation for simulations to a visually appropriate presentation is usually disconnected: in most cases a different application is used for each step, because the available tools were not built to solve this specific problem, or were developed by research groups to solve particular tasks in isolation. In volcanology, owing to this complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, or ballistic projectile ejection, among others. When studying the hazards associated with the activity of a volcano, however, it is important to analyze all the processes comprehensively, especially for communicating results to the end users: decision makers and planners. To solve this problem and connect the different parts of the workflow, we are developing the suite VOLCWORKS, built on a flexible-implementation architecture that allows rapid development of software to the extent required by the needs at hand (calculations, routines, or algorithms), both new and through redesign of software already available in the volcanological community, and that especially allows new knowledge, models, or software to be transferred into software modules. The design is a component-oriented platform that can incorporate particular solutions (routines, simulations, etc.), which can be concatenated for integration or for highlighting information. The platform includes a graphical interface capable of working in different visual environments that can be tailored to the work of different types of users (researchers, lecturers, students, etc.). The platform aims to integrate the simulation and visualization phases, incorporating proven tools that are now isolated. VOLCWORKS runs under different operating systems (Windows, Linux, and Mac OS) and adapts to the context of use automatically at runtime, both in the tasks and their sequence and in the utilization of hardware resources (CPU, GPU, special monitors, etc.). The application can run on a laptop or in a virtual reality room with access to supercomputers.

  13. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  14. Implementation of Simple and Functional Web Applications at the Alaska Volcano Observatory Remote Sensing Group

    NASA Astrophysics Data System (ADS)

    Skoog, R. A.

    2007-12-01

    Web pages are ubiquitous and accessible, but when compared to stand-alone applications they are limited in capability. The Alaska Volcano Observatory (AVO) Remote Sensing Group has implemented web pages and supporting server software that provide relatively advanced features to any user able to meet basic requirements. Anyone in the world with access to a modern web browser (such as Mozilla Firefox 1.5 or Internet Explorer 6) and a reasonable internet connection can fully use the tools, with no software installation or configuration. This allows faculty, staff, and students at AVO to perform many aspects of volcano monitoring from home or on the road as easily as from the office. Additionally, AVO collaborators such as the National Weather Service and the Anchorage Volcanic Ash Advisory Center are able to use these web tools to quickly assess volcanic events. Capabilities of this web software include (1) obtaining accurate measured remote sensing data values from a semi-quantitative compressed image of a large area, (2) viewing any data from a wide time range of data swaths, (3) viewing many different satellite remote sensing spectral bands and combinations and adjusting color range thresholds, and (4) exporting to KML files viewable in virtual globes such as Google Earth. The technologies behind this implementation are primarily Javascript, PHP, and MySQL, which are free to use and well documented, in addition to Terascan, a commercial software package used to extract data from level-0 data files. These technologies will be presented in conjunction with the techniques used to combine them into the final product used by AVO and its collaborators for operational volcano monitoring.
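
    The KML export mentioned above is straightforward to reproduce: KML is plain XML that virtual globes such as Google Earth read directly. A minimal hedged sketch in Python follows; the placemark name, description, and coordinates are illustrative, not AVO's actual output.

    ```python
    # Hedged sketch: writing a minimal KML placemark for a volcano hotspot.
    # Name, description, and coordinates are illustrative.
    KML = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>{name}</name>
        <description>{desc}</description>
        <Point><coordinates>{lon},{lat},0</coordinates></Point>
      </Placemark>
    </kml>
    """

    with open("hotspot.kml", "w") as f:
        f.write(KML.format(name="Thermal anomaly (illustrative)",
                           desc="brightness temperature 312 K (illustrative)",
                           lon=-153.43, lat=59.36))
    ```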

  15. A study about teaching quadratic functions using mathematical models and free software

    NASA Astrophysics Data System (ADS)

    Nepomucena, T. V.; da Silva, A. C.; Jardim, D. F.; da Silva, J. M.

    2017-12-01

    Given the reality of teaching mathematics in Basic Education in Brazil, and in particular the teaching of functions, whose relevance to students' academic development extends from Basic to Higher Education, this work proposes the use of educational software to support the teaching of functions in Basic Education, since computers and software stand out as options for supporting the teaching and learning processes. The study also adopts Didactic Transposition as its investigation and research methodology. As part of this survey, teaching interventions were applied to detect the main difficulties in the teaching of functions in Basic Education, and the results obtained during the interventions were analyzed qualitatively. The discussion of the results at the end of the didactic interventions showed that they were satisfactory.

  16. Using a software-defined computer in teaching the basics of computer architecture and operation

    NASA Astrophysics Data System (ADS)

    Kosowska, Julia; Mazur, Grzegorz

    2017-08-01

    The paper describes the concept and implementation of the SDC_One software-defined computer, designed for experimental and didactic purposes. Equipped with extensive hardware monitoring mechanisms, the device enables students to monitor the computer's operation on a bus-transfer-cycle or instruction-cycle basis, providing a practical illustration of the basic aspects of a computer's operation. In the paper, we describe the hardware monitoring capabilities of SDC_One and some scenarios for using it in teaching the basics of computer architecture and microprocessor operation.

  17. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  18. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
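
    The SAGE requirements-specification language itself is not shown in the report, so the following Python sketch is only a stand-in for the general idea of monitoring an executing program against a prespecified constraint: a decorator checks an invariant on every call and reports violations at run time.

    ```python
    # Hedged stand-in for requirements-constraint monitoring (the actual SAGE
    # language is not shown in the abstract): a decorator that checks an
    # invariant on every call and reports violations.
    def monitor(constraint, message):
        def wrap(fn):
            def inner(*args, **kwargs):
                result = fn(*args, **kwargs)
                if not constraint(result):
                    print(f"CONSTRAINT VIOLATED in {fn.__name__}: {message}")
                return result
            return inner
        return wrap

    @monitor(lambda v: 0.0 <= v <= 1.0, "output must stay in [0, 1]")
    def duty_cycle(load, capacity):
        return load / capacity

    duty_cycle(3, 4)    # ok
    duty_cycle(5, 4)    # flagged at run time
    ```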

  19. PC Software for Artificial Intelligence Applications.

    PubMed

    Epp, H; Kalin, M; Miller, D

    1988-05-06

    Our review has emphasized that AI tools are programming languages inspired by some problem-solving paradigm. We want to underscore their status as programming languages; even if an AI tool seems to fit a problem perfectly, its proficient use still requires the training and practice associated with any programming language. The programming manuals for PC-Plus, Smalltalk/V, and Nexpert Object are all tutorial in nature, and the corresponding software packages come with sample applications. We find the manuals to be uniformly good introductions that try to anticipate the problems of a user who is new to the technology. All three vendors offer free technical support by telephone to licensed users. AI tools are sometimes oversold as a way to make programming easy or to avoid it altogether. The truth is that AI tools demand programming, but programming that allows you to concentrate on the essentials of the problem. If we had to implement a diagnostic system, we would look first to a product such as PC-Plus rather than BASIC or C, because PC-Plus is designed specifically for such a problem, whereas these conventional languages are not. If we had to implement a system that required graphical interfaces and could benefit from inheritance, we would look first to an object-oriented system such as Smalltalk/V that provides built-in mechanisms for both. If we had to implement an expert system that called for some mix of AI and conventional techniques, we would look first to a product such as Nexpert Object that integrates various problem-solving technologies. Finally, we might use FORTRAN if we were concerned primarily with programming a well-defined numerical algorithm. AI tools are a valuable complement to traditional languages.

  20. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE, but also helps users share their apps with collaborators and release them for public use. PMID:27803802

  1. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE, but also helps users share their apps with collaborators and release them for public use.

  2. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT implements an MVC architecture using open source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
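
    AGT itself is a Laravel/PHP web application, but its core grading step (compile the submission, run it on test input, compare output) can be sketched in a few lines. The Python sketch below is a hedged illustration of that step only, assuming gcc is on the PATH; the file name and test data are hypothetical.

    ```python
    # Hedged sketch of an autograder's core check: compile a C submission and
    # diff its output against an expected answer. Paths and data illustrative.
    import subprocess, tempfile, os

    def grade(source_path, stdin_data, expected_stdout):
        exe = os.path.join(tempfile.mkdtemp(), "a.out")
        build = subprocess.run(["gcc", source_path, "-o", exe],
                               capture_output=True, text=True)
        if build.returncode != 0:
            return "compile error:\n" + build.stderr
        run = subprocess.run([exe], input=stdin_data, capture_output=True,
                             text=True, timeout=5)
        return "pass" if run.stdout.strip() == expected_stdout.strip() else "fail"

    print(grade("submission.c", "2 3\n", "5"))
    ```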

  3. Tools and Strategies for Product Life Cycle Management - A Case Study in Foundry

    NASA Astrophysics Data System (ADS)

    Patil, Rajashekar; Kumar, S. Mohan; Abhilash, E.

    2012-08-01

    Advances in information and communication technology (ICT) have opened new possibilities for collaboration among customers, suppliers, manufacturers, and partners to effectively tackle various business challenges. Product Life Cycle Management (PLM) has been a proven approach for Original Equipment Manufacturers (OEMs) to increase productivity, improve product quality, speed up delivery, increase profit, and become more efficient. However, their Tier 2 and Tier 3 suppliers, such as foundry industries, are still in their infancy and have not adopted PLM. Hence, to enhance their understanding, the basic concepts, tools, and strategies for PLM are presented in this paper. By selecting and implementing appropriate PLM strategies in a small foundry, an attempt was also made to understand the immediate benefits of using PLM tools (commercial PLM software and digital manufacturing tools). This study indicated a reduction in lead time and improved utilization of organizational resources in the production of an automobile impeller. These observations may be further extrapolated to other multi-product, multi-discipline, and multi-customer companies to realize the advantages of using PLM technology.

  4. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  5. The ALMA Common Software as a Basis for a Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered across three continents and is responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide all partners involved in the development with a common software platform. The original assumption was that some key middleware, like communication via CORBA and the use of XML and Java, would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system that will have been developed in a distributed fashion. This paper evaluates our progress after 1.5 years of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.

  6. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented experienced-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  7. caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oster, S.; Langella, S.; Hastings, S.

    To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG{trademark}) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: .

  8. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    NASA Astrophysics Data System (ADS)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for complex, and continually improving, information integration and data extraction applications. We have developed one such application, which we present as an example, and invite new collaborations to develop other such applications.
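
    As a hedged illustration of the ElasticSearch-based discovery step described above, the following Python sketch (using the official elasticsearch client) queries a full-text index for documents relevant to an extraction task; the index name, field name, and search terms are hypothetical.

    ```python
    # Hedged sketch: the abstract says ElasticSearch identifies documents
    # relevant to an extraction task. Index and field names are hypothetical.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")
    resp = es.search(index="documents", query={
        "match": {"body_text": "carbonate stromatolite Proterozoic"}
    }, size=10)
    for hit in resp["hits"]["hits"]:
        print(hit["_id"], hit["_score"], hit["_source"].get("title"))
    ```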

  9. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost effective software. Therefore, government and industry

  10. Reviews of Instructional Software in Scholarly Journals: A Selected Bibliography.

    ERIC Educational Resources Information Center

    Bantz, David A.; And Others

    This bibliography lists reviews of more than 100 instructional software packages, which are arranged alphabetically by discipline. Information provided for each entry includes the topical emphasis, type of software (i.e., simulation, tutorial, analysis tool, test generator, database, writing tool, drill, plotting tool, videodisc), the journal…

  11. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  12. Assistive Software Tools for Secondary-Level Students with Literacy Difficulties

    ERIC Educational Resources Information Center

    Lange, Alissa A.; McPhillips, Martin; Mulhern, Gerry; Wylie, Judith

    2006-01-01

    The present study assessed the compensatory effectiveness of four assistive software tools (speech synthesis, spellchecker, homophone tool, and dictionary) on literacy. Secondary-level students (N = 93) with reading difficulties completed computer-based tests of literacy skills. Training on their respective software followed for those assigned to…

  13. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  14. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  15. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented in a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
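
    The tool itself was written in C#; for consistency with the other examples in this collection, here is a hedged NumPy sketch of steps (ii)-(iv): interpolating correction factors from a look-up table keyed on the MLC-shielded fraction and multiplying them into the predicted image. All array values and look-up entries are illustrative, not the paper's measured factors.

    ```python
    # Hedged sketch of steps (ii)-(iv): multiply the predicted EPID image by
    # correction factors interpolated from a look-up table keyed on the
    # fraction of beam-on time each pixel spends under MLC leaves.
    import numpy as np

    mlc_fraction = np.random.rand(256, 256)      # stand-in for step (ii)
    lut_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    lut_factor   = np.array([1.00, 0.98, 0.95, 0.93, 0.90])  # illustrative

    correction = np.interp(mlc_fraction, lut_fraction, lut_factor)  # step (iii)
    pdip = np.random.rand(256, 256)              # predicted image from the TPS
    corrected_pdip = pdip * correction           # step (iv)
    ```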

  16. Study of a unified hardware and software fault-tolerant architecture

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Friend, Steven; Greeley, Gregory; Sacco, Stephen; Adams, Stuart

    1989-01-01

    A unified architectural concept, called the Fault Tolerant Processor Attached Processor (FTP-AP), that can tolerate hardware as well as software faults is proposed for applications requiring ultrareliable computation capability. An emulation of the FTP-AP architecture, consisting of a breadboard Motorola 68010-based quadruply redundant Fault Tolerant Processor, four VAX 750s as attached processors, and four versions of a transport aircraft yaw damper control law, is used as a testbed in the AIRLAB to examine a number of critical issues. Solutions of several basic problems associated with N-Version software are proposed and implemented on the testbed. This includes a confidence voter to resolve coincident errors in N-Version software. A reliability model of N-Version software that is based upon the recent understanding of software failure mechanisms is also developed. The basic FTP-AP architectural concept appears suitable for hosting N-Version application software while at the same time tolerating hardware failures. Architectural enhancements for greater efficiency, software reliability modeling, and N-Version issues that merit further research are identified.
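
    The confidence voter for resolving coincident errors in N-Version software is not specified in the abstract; as a hedged stand-in, the sketch below shows the simplest relative of such a voter, a majority vote over the versions' outputs with a numerical agreement tolerance.

    ```python
    # Hedged sketch of N-version majority voting with a tolerance, in the
    # spirit of (but not identical to) the confidence voter described.
    def vote(outputs, tol=1e-3):
        """Return a value agreed on by a majority of versions, else None."""
        for candidate in outputs:
            agreeing = [o for o in outputs if abs(o - candidate) <= tol]
            if len(agreeing) > len(outputs) / 2:
                return sum(agreeing) / len(agreeing)
        return None   # coincident disagreement: escalate to recovery logic

    print(vote([0.501, 0.499, 0.762]))   # majority of 2 out of 3 wins
    print(vote([0.1, 0.5, 0.9]))         # no majority -> None
    ```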

  17. Evaluate the Usability of the Mobile Instant Messaging Software in the Elderly.

    PubMed

    Wen, Tzu-Ning; Cheng, Po-Liang; Chang, Po-Lun

    2017-01-01

    Instant messaging (IM) is a form of online chat that provides real-time text transmission over the Internet and has become one of the most popular communication tools. Even in the current era of smartphones, teaching the elderly to use smartphones and promoting their use remains a great challenge, and the acceptance of IM by the elderly remains unknown. This study describes the usability and evaluates the acceptance of IM among elderly people using a smartphone for the first time. This is a quasi-experimental design study. The study period ran from October 2012 to December 2013. A total of 41 elderly participants were recruited, all of whom were using the LINE app on a smartphone for the first time. Usability was evaluated using the Technology Acceptance Model, which consists of four constructs: cognitive usability, cognitive ease of use, attitude, and willingness to use. Overall, the elderly rated "attitude" toward the LINE app highest, with an average of 4.07 points across the four constructs, followed by an average of 4 points for "cognitive usefulness". The scores for "cognitive ease of use" and "willingness to use" were equal, with an average of 3.86. This can be interpreted to mean that (1) the elderly regarded the LINE app as an excellent communication tool, (2) they found the software useful, and (3) it was convenient for them to communicate. However, it was necessary to additionally assist with and explain certain functions, such as the options; such assistance would play a great role in the "willingness to use". The positive acceptance of the LINE app among the elderly suggests a probably similar acceptance of other communication software. Encouraging the elderly's willingness to explore more technology products and understanding their behavior will provide the basic knowledge needed to develop further software.

  18. Dataflow Design Tool: User's Manual

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1996-01-01

    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
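
    One of the performance bounds such dataflow analyses determine is the critical-path length of the task graph: no schedule can finish faster, no matter how many processors are used. A hedged Python sketch on an illustrative four-task graph (the tool's actual algorithms are more elaborate):

    ```python
    # Hedged sketch: critical-path length of a dataflow (task) graph, a lower
    # bound on schedule makespan. Graph and execution times are illustrative.
    import functools

    graph = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}   # dependencies
    time  = {"A": 2, "B": 3, "C": 5, "D": 1}                     # task costs

    @functools.lru_cache(maxsize=None)
    def finish(task):
        preds = graph[task]
        return time[task] + (max(finish(p) for p in preds) if preds else 0)

    print(max(finish(t) for t in graph))   # critical path A -> C -> D = 8
    ```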

  19. Rocket Science 101 Interactive Educational Program

    NASA Technical Reports Server (NTRS)

    Armstrong, Dennis; Funkhouse, Deborah; DiMarzio, Donald

    2007-01-01

    To better educate the public on the basic design of NASA's current mission rockets, Rocket Science 101 software has been developed as an interactive program designed to retain a user's attention and to teach about basic rocket parts. This program also has helped to expand NASA's presence on the Web regarding educating the public about the Agency's goals and accomplishments. The software was designed using Macromedia's Flash 8. It allows the user to select which type of rocket they want to learn about, interact with the basic parts, assemble the parts to create the whole rocket, and then review the basic flight profile of the rocket they have built.

  20. MO-E-BRD-01: Adapt-A-Thon - Texas Hold’em Invitational

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kessler, M; Brock, K; Pouliot, J

    2014-06-15

    Software tools for image-based adaptive radiotherapy such as deformable image registration, contour propagation and dose mapping have progressed beyond the research setting and are now commercial products available as part of both treatment planning systems and stand-alone applications. These software tools are used together to create clinical workflows to detect, track and evaluate changes in the patient and to accumulate dose. Deviations uncovered in this process are used to guide decisions about replanning/adaptation with the goal of keeping the delivery of prescribed dose “on target” throughout the entire course of radiotherapy. Since the output from one step of the adaptive process is used as an input for another, it is essential to understand and document the uncertainty associated with each of the steps and how these uncertainties are propagated. This in turn requires an understanding of how the underlying tools work. Unfortunately, important details about the algorithms used to implement these tools are scarce or incomplete, too often for competitive reasons. This is in contrast to the situation involving other basic treatment planning algorithms such as dose calculations, where the medical physics community essentially requires vendors to provide physically important details about their underlying theory and clinical implementation. Vendors should adopt this same level of information sharing when it comes to the tools and techniques for image guided adaptive radiotherapy. The goal of this session is to start this process by inviting vendors and medical physicists to discuss and demonstrate the available tools and describe how they are intended to be used in clinical practice. The format of the session will involve a combination of formal presentations, interactive demonstrations, audience participation and some friendly “Texas style” competition. Learning Objectives: Understand the components of the image-based adaptive radiotherapy process. Understand how these components are implemented in various commercial systems. Understand the different use cases and workflows currently supported by these tools.

  1. Virtual tool mark generation for efficient striation analysis in forensic science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekstrand, Laura

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles. Preliminary results are quite promising. In a study with both sides of 6 screwdriver tips and 34 corresponding marks, the method distinguished known matches from known non-matches with zero false positive matches and only two matches mistaken for non-matches. For matches, it could predict the correct marking angle within 5-10°. Moreover, on a standard desktop computer, the virtual marking software is capable of cleaning 3D tip and plate scans in minutes and of producing a virtual mark and comparing it to a real mark in seconds. These results support several of the professional conclusions of the tool mark analysis community, including the idea that marks produced by the same tool only match if they are made at similar angles. The method also displays the potential to automate part of the comparison process, freeing the examiner to focus on other tasks, which is important in busy, backlogged crime labs. Finally, the method offers the unique chance to directly link an evidence mark to the tool that produced it while reducing potential damage to the evidence.

  2. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processor systems, on practically every system), has long been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view: we are using a set of scientific applications, each with its own challenges, both to drive improvements to the applications themselves and to uncover shortcomings in Eclipse PTP from an application developer's perspective, which in turn shape the list of improvements we pursue. We are also partnering with performance tool providers to drive higher-quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials for incorporation into computational science and engineering curricula. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, in computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately developing and maintaining test suites: things that have become commonplace in many software endeavors but have lagged in the development of science applications. In our view, the increased complexity of both HPC systems and science applications demands better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.

  3. Research pressure instrumentation for NASA Space Shuttle main engine, modification no. 5

    NASA Technical Reports Server (NTRS)

    Anderson, P. J.; Nussbaum, P.; Gustafson, G.

    1984-01-01

    The objective of the research project described is to define and demonstrate methods to advance the state of the art of pressure sensors for the space shuttle main engine (SSME). Silicon piezoresistive technology was utilized in completing tasks: generation and testing of three transducer design concepts for solid state applications; silicon resistor characterization at cryogenic temperatures; experimental chip mounting characterization; frequency response optimization and prototype design and fabrication. Excellent silicon sensor performance was demonstrated at liquid nitrogen temperature. A silicon resistor ion implant dose was customized for SSME temperature requirements. A basic acoustic modeling software program was developed as a design tool to evaluate frequency response characteristics.

  4. Petri nets as a modeling tool for discrete concurrent tasks of the human operator. [describing sequential and parallel demands on human operators

    NASA Technical Reports Server (NTRS)

    Schumacher, W.; Geiser, G.

    1978-01-01

    The basic concepts of Petri nets are reviewed, as well as their application as the fundamental model of technical systems with concurrent discrete events, such as hardware systems and software models of computers. The use of Petri nets is proposed for modeling the human operator dealing with concurrent discrete tasks. Their properties useful in modeling the human operator are discussed and practical examples are given. By means of an experimental investigation of binary concurrent tasks which are presented in a serial manner, the representation of human behavior by Petri nets is demonstrated.
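
    To make the basic concepts concrete: a Petri net is just places holding tokens and transitions that fire when all their input places are marked, consuming input tokens and producing output tokens. The minimal Python sketch below models an operator choosing between two concurrent tasks; the net structure is illustrative, not taken from the paper.

    ```python
    # Minimal Petri net sketch: a transition is enabled when every input place
    # holds enough tokens; firing moves tokens from inputs to outputs.
    marking = {"idle": 1, "a_done": 0, "b_done": 0}
    transitions = {
        "do_a":  ({"idle": 1},   {"a_done": 1}),
        "do_b":  ({"idle": 1},   {"b_done": 1}),
        "reset": ({"a_done": 1}, {"idle": 1}),
    }

    def enabled(t):
        inputs, _ = transitions[t]
        return all(marking[p] >= n for p, n in inputs.items())

    def fire(t):
        assert enabled(t), f"{t} is not enabled"
        inputs, outputs = transitions[t]
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] += n

    fire("do_a")
    print(marking)   # {'idle': 0, 'a_done': 1, 'b_done': 0}
    ```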

  5. Virtual Reality Training System for Anytime/Anywhere Acquisition of Surgical Skills: A Pilot Study.

    PubMed

    Zahiri, Mohsen; Booton, Ryan; Nelson, Carl A; Oleynikov, Dmitry; Siu, Ka-Chun

    2018-03-01

    This article presents a hardware/software simulation environment suitable for anytime/anywhere surgical skills training. It blends the advantages of physical hardware and task analogs with the flexibility of virtual environments. This is further enhanced by a web-based implementation of training feedback accessible to both trainees and trainers. Our training system provides a self-paced and interactive means to attain proficiency in basic tasks that could potentially be applied across a spectrum of trainees from first responder field medical personnel to physicians. This results in a powerful training tool for surgical skills acquisition relevant to helping injured warfighters.

  6. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    NASA Astrophysics Data System (ADS)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
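
    gr-MRI's flowgraphs are considerably richer than this, but the underlying GNU Radio pattern is simple: connect a signal source through optional processing blocks into a sink inside a gr.top_block. A hedged minimal sketch follows; the frequency, sample rate, and file name are illustrative, not gr-MRI's actual settings.

    ```python
    # Hedged minimal GNU Radio flowgraph: record n complex samples of a test
    # tone to a file. Parameters are illustrative, not gr-MRI's settings.
    from gnuradio import gr, analog, blocks

    class ToneToFile(gr.top_block):
        def __init__(self, samp_rate=1e6, freq=250e3, n=4096):
            gr.top_block.__init__(self, "tone_to_file")
            src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, freq, 1.0)
            head = blocks.head(gr.sizeof_gr_complex, n)   # stop after n samples
            sink = blocks.file_sink(gr.sizeof_gr_complex, "tone.dat")
            self.connect(src, head, sink)

    if __name__ == "__main__":
        ToneToFile().run()
    ```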

  7. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  8. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  9. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal-computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
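
    MEGG itself reads the HST mission schedule; the general recipe for turning a schedule of (event, start, duration) rows into a graphical timeline can be sketched with matplotlib's broken_barh, as below. The event names and times are illustrative, not actual HST schedule data.

    ```python
    # Hedged sketch: render a schedule as a graphical timeline with matplotlib.
    # Event names and times are illustrative.
    import matplotlib.pyplot as plt

    events = [("Slew", 0.0, 0.4), ("Acquire guide stars", 0.4, 0.3),
              ("Science exposure", 0.7, 1.5)]

    fig, ax = plt.subplots(figsize=(8, 2))
    for row, (name, start, dur) in enumerate(events):
        ax.broken_barh([(start, dur)], (row - 0.3, 0.6))   # one bar per event
        ax.text(start, row + 0.35, name, fontsize=8)
    ax.set_xlabel("Elapsed time (hours)")
    ax.set_yticks([])
    plt.tight_layout()
    plt.savefig("timeline.png")
    ```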

  10. Designing for User Cognition and Affect in Software Instructions

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2008-01-01

    In this paper we examine how to design software instructions for user cognition and affect. A basic and co-user manual are compared. The first provides fundamental support for both; the latter includes a buddy to further optimize support for user affect. The basic manual was faster and judged as easier to process than the co-user manual. In…

  11. User Studies: Developing Learning Strategy Tool Software for Children.

    ERIC Educational Resources Information Center

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  12. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  13. Utah's Regional/Urban ANSS Seismic Network---Strategies and Tools for Quality Performance

    NASA Astrophysics Data System (ADS)

    Burlacu, R.; Arabasz, W. J.; Pankow, K. L.; Pechmann, J. C.; Drobeck, D. L.; Moeinvaziri, A.; Roberson, P. M.; Rusho, J. A.

    2007-05-01

    The University of Utah's regional/urban seismic network (224 stations recorded: 39 broadband, 87 strong-motion, 98 short-period) has become a model for locally implementing the Advanced National Seismic System (ANSS) because of successes in integrating weak- and strong-motion recording and in developing an effective real-time earthquake information system. Early achievements included implementing ShakeMap, ShakeCast, point-to-multipoint digital telemetry, and an Earthworm Oracle database, as well as in-situ calibration of all broadband and strong-motion stations and submission of all data and metadata into the IRIS DMC. Regarding quality performance, our experience as a medium-size regional network affirms the fundamental importance of basics such as the following: for data acquisition, deliberate attention to high-quality field installations, signal quality, and computer operations; for operational efficiency, a consistent focus on professional project management and human resources; and for customer service, healthy partnerships, including constant interactions with emergency managers, engineers, public policy-makers, and other stakeholders as part of an effective state earthquake program. (Operational cost efficiencies almost invariably involve trade-offs between personnel costs and the quality of hardware and software.) Software tools that we currently rely on for quality performance include those developed by UUSS (e.g., SAC and shell scripts for estimating local magnitudes) and software developed by other organizations such as USGS (Earthworm), University of Washington (interactive analysis software), ISTI (SeisNetWatch), and IRIS (PDCC, BUD tools). Although there are many pieces, there is little integration. One of the main challenges we face is the availability of a complete and coherent set of tools for automatic and post-processing to assist in achieving the goals/requirements set forth by ANSS. Taking our own network, and ANSS, to the next level will require standardized, well-designed, and supported software. Other advances in seismic network performance will come from diversified instrumentation. We have recently shown the utility of incorporating strong-motion data (even from soil sites) into the routine analysis of local seismicity, and have also collocated an acoustic array with a broadband seismic station (in collaboration with Southern Methodist University). For the latter experiment, the purpose of collocated seismic and infrasound sensors is to (1) further an understanding of the physics associated with the generation and the propagation of seismic and low-frequency acoustic energy from shallow sources and (2) explore the potential for blast discrimination and improved source location using seismic and infrasonic data in a synergetic way.

  14. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper will discuss how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative of implementing an institutional repository using the DSpace open source software.

  15. Tools for Administration of a UNIX-Based Network

    NASA Technical Reports Server (NTRS)

    LeClaire, Stephen; Farrar, Edward

    2004-01-01

    Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.

  16. A Role-Playing Game for a Software Engineering Lab: Developing a Product Line

    ERIC Educational Resources Information Center

    Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio

    2012-01-01

    Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…

  17. The Individual Basic Facts Assessment Tool

    ERIC Educational Resources Information Center

    Tait-McCutcheon, Sandi; Drake, Michael

    2015-01-01

    There is an identified and growing need for a levelled diagnostic basic facts assessment tool that provides teachers with formative information about students' mastery of a broad range of basic fact sets. The Individual Basic Facts Assessment tool has been iteratively and cumulatively developed, trialled, and refined with input from teachers and…

  18. MicMac GIS application: free open source

    NASA Astrophysics Data System (ADS)

    Duarte, L.; Moutinho, O.; Teodoro, A.

    2016-10-01

    The use of Remotely Piloted Aerial Systems (RPAS) for remote sensing applications is becoming more frequent, as the technologies of the on-board cameras and of the platform itself become a serious contender to satellite and airplane imagery. MicMac is a photogrammetric tool for image matching that can be used in different contexts. It is open source software and can be used from the command line or through a graphic interface (for each command). The main objective of this work was the integration of MicMac with QGIS, which is also open source software, in order to create a new open source tool applied to photogrammetry/remote sensing. The Python language was used to develop the application, which is very useful for the manipulation and 3D modelling of a set of images. A toolbar was created in QGIS exposing the basic functionalities through intuitive graphic interfaces. The toolbar is composed of three buttons: produce the point cloud, create the Digital Elevation Model (DEM), and produce the orthophoto of the study area. The application was tested on 35 photos, a subset of images acquired by an RPAS in the Aguda beach area, Porto, Portugal, which were used to create a 3D terrain model and, from this model, obtain an orthophoto and the corresponding DEM. The code is open and can be modified according to user requirements. This integration, combining MicMac with GIS capabilities, should be very useful to the photogrammetry and remote sensing community.
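
    As a rough illustration of the integration described above, the sketch below shows how a three-button toolbar can be wired into QGIS with the PyQGIS API. It is a minimal sketch, not the authors' plugin code: the class and handler names are hypothetical, and each handler would in practice invoke the corresponding MicMac command.

    ```python
    # Minimal sketch of a QGIS plugin toolbar (hypothetical names, not the
    # authors' code). Each QAction stands for one of the three buttons:
    # point cloud, DEM, and orthophoto.
    from qgis.PyQt.QtWidgets import QAction

    class MicMacToolbarSketch:
        def __init__(self, iface):
            self.iface = iface        # QGIS interface object passed in by QGIS
            self.actions = []

        def initGui(self):
            for label, handler in [("Point cloud", self.run_point_cloud),
                                   ("DEM", self.run_dem),
                                   ("Orthophoto", self.run_ortho)]:
                action = QAction(label, self.iface.mainWindow())
                action.triggered.connect(handler)
                self.iface.addToolBarIcon(action)
                self.actions.append(action)

        def unload(self):
            for action in self.actions:
                self.iface.removeToolBarIcon(action)

        def run_point_cloud(self):
            pass   # would call MicMac's image-matching pipeline here

        def run_dem(self):
            pass   # would build the DEM from the point cloud here

        def run_ortho(self):
            pass   # would produce the orthophoto here
    ```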

  19. Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait analysis.

    PubMed

    Pfister, Alexandra; West, Alexandre M; Bronner, Shaw; Noah, Jack Adam

    2014-07-01

    Biomechanical analysis is a powerful tool in the evaluation of movement dysfunction in orthopaedic and neurologic populations. Three-dimensional (3D) motion capture systems are widely used, accurate systems, but are costly and not available in many clinical settings. The Microsoft Kinect™ has the potential to be used as an alternative low-cost motion analysis tool. The purpose of this study was to assess concurrent validity of the Kinect™ with Brekel Kinect software, in comparison to Vicon Nexus, for sagittal-plane gait kinematics. Twenty healthy adults (9 male, 11 female) were tracked while walking and jogging at three velocities on a treadmill. Concurrent hip and knee peak flexion and extension and stride timing measurements were compared between Vicon and Kinect™. Although Kinect™ measurements were representative of normal gait, the Kinect™ generally under-estimated joint flexion and over-estimated extension. Kinect™ and Vicon hip angular displacement correlation was very low and error was large. Kinect™ knee measurements were somewhat better than hip, but were not consistent enough for clinical assessment. Correlation between Kinect™ and Vicon stride timing was high and error was fairly small. Variability in Kinect™ measurements was smallest at the slowest velocity. The Kinect™ has basic motion capture capabilities and, with some minor adjustments, will be an acceptable tool to measure stride timing, but sophisticated advances in software and hardware are necessary to improve Kinect™ sensitivity before it can be implemented for clinical use.
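
    The agreement statistics such a validation rests on are simple to state. A minimal sketch, assuming two joint-angle series sampled at the same instants; the arrays below are illustrative placeholders, not the study's data.

    ```python
    # Concurrent-validity sketch: correlation, RMS error, and mean bias
    # between two motion-capture systems measuring the same joint angle.
    import numpy as np

    def agreement(reference_deg, test_deg):
        r = np.corrcoef(reference_deg, test_deg)[0, 1]            # correlation
        rmse = np.sqrt(np.mean((reference_deg - test_deg) ** 2))  # error size
        bias = np.mean(test_deg - reference_deg)   # negative => under-estimate
        return r, rmse, bias

    vicon = np.array([12.0, 35.2, 58.1, 40.3, 15.6])   # placeholder angles, deg
    kinect = np.array([10.1, 30.8, 52.4, 37.0, 13.2])
    print(agreement(vicon, kinect))
    ```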

  20. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy as well as, more importantly, funding issues that need to be addressed for improving software engineering quality and ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  2. Simulation for Wind Turbine Generators -- With FAST and MATLAB-Simulink Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, M.; Muljadi, E.; Jonkman, J.

    This report presents the work done to develop generator and gearbox models in the Matrix Laboratory (MATLAB) environment and couple them to the National Renewable Energy Laboratory's Fatigue, Aerodynamics, Structures, and Turbulence (FAST) program. The goal of this project was to interface the superior aerodynamic and mechanical models of FAST to the excellent electrical generator models found in various Simulink libraries and applications. The scope was limited to Type 1, Type 2, and Type 3 generators and fairly basic gear-train models. Future work will include models of Type 4 generators and more-advanced gear-train models with increased degrees of freedom. As described in this study, implementation of the developed drivetrain model enables the software tool to be used in many ways. Several case studies are presented as examples of the many types of studies that can be performed using this tool.
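
    The essence of such a coupling is that the aerodynamic side supplies mechanical torque and the generator side supplies electrical torque, with a drivetrain model mediating between them. The toy sketch below integrates a one-mass drivetrain in Python; it is an assumption-level illustration, far simpler than the FAST/Simulink interface itself.

    ```python
    # One-mass drivetrain toy: J * domega/dt = T_aero - T_gen.
    def drivetrain_step(omega, t_aero, t_gen, inertia, dt):
        """Advance rotor speed one fixed time step (rad/s)."""
        return omega + dt * (t_aero - t_gen) / inertia

    omega = 1.2             # initial rotor speed, rad/s
    for _ in range(10):     # crude stand-in for the co-simulation loop
        t_aero = 4.0e6      # would come from the aerodynamic (FAST) side, N*m
        t_gen = 3.8e6       # would come from the generator (Simulink) side, N*m
        omega = drivetrain_step(omega, t_aero, t_gen, inertia=4.0e7, dt=0.01)
    print(omega)
    ```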

  3. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations, including a time response for random white noise disturbance. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  4. The ASP Sensor Network: Infrastructure for the Next Generation of NASA Airborne Science

    NASA Astrophysics Data System (ADS)

    Myers, J. S.; Sorenson, C. E.; Van Gilst, D. P.; Duley, A.

    2012-12-01

    A state-of-the-art real-time data communications network is being implemented across the NASA Airborne Science Program core platforms. Utilizing onboard Ethernet networks and satellite communications systems, it is intended to maximize the science return from both single-platform missions and complex multi-aircraft Earth science campaigns. It also provides an open platform for data visualization and synthesis software tools, for use by the science instrument community. This paper will describe the prototype implementations currently deployed on the NASA DC-8 and Global Hawk aircraft, and the ongoing effort to expand the capability to other science platforms. Emphasis will be on the basic network architecture, the enabling hardware, and new standardized instrument interfaces. The new Mission Tools Suite, which provides a web-based user interface, will also be described, together with several example use-cases of this evolving technology.

  5. E-nursing documentation as a tool for quality assurance.

    PubMed

    Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros

    2006-01-01

    The article presents the results of a project to reengineer nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and consequently quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The model of the new documentation enables tracing, transparency, selectivity, monitoring and analyses. All these factors lead to improvements in the health system as well as to improved safety of patients and members of nursing teams. The documentation was thus developed for three health care segments: the secondary and tertiary level, dispensaries and community health care. The new quality introduced to the documentation process by information and communication technology is presented by a database model and a software prototype for managing documentation.

  6. Analysis of the temperature of the hot tool in the cut of woven fabric using infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Verderio, Leonardo A.; Gonzaga, Adilson; Ruffino, Rosalvo T.

    2001-03-01

    Textile manufacturing occupies a prominent place in the national economy. By virtue of its importance, research has been conducted on the development of new materials, equipment and methods used in the production process. The cutting of fabric is a basic stage, followed by the making of clothes and other articles. In the hot cutting of fabric, one of the variables of greatest importance in the control of the process is the contact temperature between the tool and the fabric. This work presents a technique for measuring that temperature based on the processing of infrared images. For this purpose, a system was developed composed of an infrared camera, a frame-grabber PC board and software that analyzes the temperature at points in the cut area, enabling the operator to achieve the necessary control of the other variables involved in the process.

  7. Laws of reflection and Snell's law revisited by video modeling

    NASA Astrophysics Data System (ADS)

    Rodrigues, M.; Simeão Carvalho, P.

    2014-07-01

    Video modelling is nowadays used as a tool for teaching and learning several topics in physics, most of them related to kinematics. In this work we show how video modelling can be used for demonstrations and experimental teaching in optics, namely the laws of reflection and the well-known Snell's law. Videos were recorded with a photo camera at 30 frames/s and analysed with the open source software Tracker. Data collected from several frames were treated with the Data Tool module, and graphs were built to obtain relations between the incident, reflected and refracted angles, as well as to determine the refractive index of Perspex. These videos can be freely distributed on the web and explored with students within the classroom, or as homework assignments to improve students' understanding of specific content. They present a large didactic potential for teaching basic optics in high school with an interactive methodology.
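
    The underlying analysis is a direct application of Snell's law, sin(i) = n sin(r): plotting sin(i) against sin(r) for the angles measured frame-by-frame gives a straight line through the origin whose slope is the refractive index. A minimal sketch, with illustrative angle values rather than the paper's data:

    ```python
    # Recover the refractive index from measured incidence/refraction angles
    # via a least-squares slope through the origin: n = sum(si*sr)/sum(sr^2).
    import numpy as np

    incident_deg = np.array([10, 20, 30, 40, 50, 60])        # placeholder data
    refracted_deg = np.array([6.7, 13.3, 19.6, 25.4, 30.7, 35.3])

    si = np.sin(np.radians(incident_deg))
    sr = np.sin(np.radians(refracted_deg))

    n = np.dot(si, sr) / np.dot(sr, sr)
    print(f"estimated refractive index n = {n:.2f}")   # Perspex is about 1.49
    ```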

  8. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  9. 47 CFR 73.9007 - Robustness requirements for covered demodulator products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Digital Broadcast Television Redistribution Control § 73.9007...-available tools or equipment also means specialized electronic tools or software tools that are widely... requirements set forth in this subpart. Such specialized electronic tools or software tools includes, but is...

  10. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific communities a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data are used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions for use in CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier and faster and, more importantly, far more portable. The integration of ground-site observed surface fluxes further helps the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
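
    Two of the QC functions named above, histograms and anomalies, reduce to a few array operations. A minimal sketch with synthetic data standing in for CERES fluxes (not the OVT code itself):

    ```python
    # QC sketch: 1-D histogram of a flux field and per-cell anomalies
    # relative to the temporal mean.
    import numpy as np

    rng = np.random.default_rng(0)
    # 12 months of a synthetic TOA shortwave flux field on a 1-degree grid
    flux = 100 + 10 * rng.standard_normal((12, 180, 360))   # W m^-2

    climatology = flux.mean(axis=0)              # temporal average per grid cell
    anomaly = flux - climatology                 # monthly anomalies
    counts, edges = np.histogram(flux, bins=50)  # 1-D histogram for screening

    print("largest absolute anomaly:", np.abs(anomaly).max())
    ```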

  11. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar.

    PubMed

    Taylor, Philip D; Brzustowski, John M; Matkovich, Carolyn; Peckford, Michael L; Wilson, Dave

    2010-10-26

    Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets.
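
    The design radR describes, fast built-in processing with user functions invoked at pre-defined steps, is a classic hook architecture. The snippet below is a Python analogue of that idea (radR itself exposes this through R, and this is not its actual API):

    ```python
    # Hook-style pipeline: each stage runs built-in processing, then any
    # user-registered callbacks, so behaviour is customizable without
    # touching the core code.
    from collections import defaultdict

    hooks = defaultdict(list)

    def register(stage, fn):
        hooks[stage].append(fn)

    def transform(stage, scan):
        return scan            # stand-in for the fast built-in processing

    def process_scan(scan):
        for stage in ("raw", "filtered", "targets"):
            scan = transform(stage, scan)
            for fn in hooks[stage]:     # user code runs here
                scan = fn(scan)
        return scan

    # Example user hook: drop weak echoes after the filtering stage
    register("filtered", lambda scan: [s for s in scan if s > 0.2])
    print(process_scan([0.1, 0.5, 0.9]))
    ```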

  12. Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gazis, P. R.; Levit, C.; Way, M. J.

    2010-12-01

    Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.

  13. The Particle Physics Playground website: tutorials and activities using real experimental data

    NASA Astrophysics Data System (ADS)

    Bellis, Matthew; CMS Collaboration

    2016-03-01

    The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files, and users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
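
    A typical starter exercise of the kind described is reconstructing an invariant mass peak from a text file of particle four-vectors, using m^2 = E^2 - |p|^2 in natural units. A hedged sketch follows; the file name and column names are hypothetical, not the site's actual format.

    ```python
    # Dimuon invariant mass from a CSV of per-muon energies and momenta.
    import numpy as np
    import pandas as pd

    df = pd.read_csv("dimuon_events.csv")    # hypothetical file and columns

    E = df.E1 + df.E2
    px, py, pz = df.px1 + df.px2, df.py1 + df.py2, df.pz1 + df.pz2
    mass = np.sqrt(np.maximum(E**2 - px**2 - py**2 - pz**2, 0.0))

    print(mass.describe())    # a peak near 3.1 GeV would be the J/psi
    ```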

  14. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar

    PubMed Central

    2010-01-01

    Background Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Results Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Conclusions Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets. PMID:20977735

  15. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
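
    The core of a data-driven sequencing approach is that mode transitions live in configuration data rather than in compiled logic. The fragment below is an assumed, deliberately simplified illustration of that idea, not Orion's actual design:

    ```python
    # Table-driven sequencing: editing the table reconfigures the automated
    # sequence without recompiling the software.
    SEQUENCE_TABLE = [
        # (current_mode, event, next_mode) -- hypothetical entries
        ("coast",  "deorbit_burn_cmd", "burn"),
        ("burn",   "burn_complete",    "entry"),
        ("entry",  "chute_deploy_alt", "descent"),
    ]

    def next_mode(mode, event):
        for cur, ev, nxt in SEQUENCE_TABLE:
            if cur == mode and ev == event:
                return nxt
        return mode    # no transition defined: remain in the current mode

    assert next_mode("coast", "deorbit_burn_cmd") == "burn"
    ```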

  16. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  17. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  18. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  19. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    Under the Computer-Aided Engineering for Electric-Drive Vehicle Batteries (CAEBAT) project, NREL develops software tools that help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs).

  20. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

    The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed. The goals and principles of software engineering are applied. An introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crisis; the mandate of the Space Station Program; the software life cycle model; software engineering; and Ada under the software engineering umbrella.

  1. High-Level Performance Modeling of SAR Systems

    NASA Technical Reports Server (NTRS)

    Chen, Curtis

    2006-01-01

    SAUSAGE (Still Another Utility for SAR Analysis that's General and Extensible) is a computer program for modeling the performance of synthetic-aperture radar (SAR) or interferometric synthetic-aperture radar (InSAR or IFSAR) systems. The user is assumed to be familiar with the basic principles of SAR imaging and interferometry. Given design parameters (e.g., altitude, power, and bandwidth) that characterize a radar system, the software predicts various performance metrics (e.g., signal-to-noise ratio and resolution). SAUSAGE is intended to be a general software tool for quick, high-level evaluation of radar designs; it is not meant to capture all the subtleties, nuances, and particulars of specific systems. SAUSAGE was written to facilitate the exploration of engineering tradeoffs within the multidimensional space of design parameters. Typically, this space is examined through an iterative process of adjusting the values of the design parameters and examining the effects of the adjustments on the overall performance of the system at each iteration. The software is designed to be modular and extensible to enable consideration of a variety of operating modes and antenna beam patterns, including, for example, strip-map and spotlight SAR acquisitions, polarimetry, burst modes, and squinted geometries.
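
    As a toy illustration of mapping design parameters to a performance metric, in the spirit of SAUSAGE but not its actual models, the sketch below evaluates the single-pulse point-target radar equation, SNR = Pt G^2 lambda^2 sigma / ((4 pi)^3 R^4 k T B L):

    ```python
    # Point-target radar equation as a design-parameter -> metric function.
    import math

    def point_target_snr_db(pt_w, gain_db, wavelength_m, rcs_m2,
                            range_m, noise_temp_k, bandwidth_hz, losses_db=3.0):
        k = 1.380649e-23                   # Boltzmann constant, J/K
        g = 10 ** (gain_db / 10)
        loss = 10 ** (losses_db / 10)
        snr = (pt_w * g**2 * wavelength_m**2 * rcs_m2) / (
            (4 * math.pi)**3 * range_m**4
            * k * noise_temp_k * bandwidth_hz * loss)
        return 10 * math.log10(snr)

    # Illustrative L-band spaceborne-like numbers (assumptions, not a real design)
    print(point_target_snr_db(pt_w=4000, gain_db=40, wavelength_m=0.24,
                              rcs_m2=10, range_m=800e3,
                              noise_temp_k=500, bandwidth_hz=80e6))
    ```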

  2. Motions of Celestial Bodies; Computer simulations

    NASA Astrophysics Data System (ADS)

    Butikov, Eugene

    2014-10-01

    This book is written for a wide range of graduate and undergraduate students studying various courses in physics and astronomy. It is accompanied by the award-winning educational software package 'Planets and Satellites' developed by the author. This text, together with the interactive software, is intended to help students learn and understand the fundamental concepts and the laws of physics as they apply to the fascinating world of the motions of natural and artificial celestial bodies. The primary aim of the book is the understanding of the foundations of classical and modern physics, while their application to celestial mechanics is used to illustrate these concepts. The simulation programs create vivid and lasting impressions of the investigated phenomena, and provide students and their instructors with a powerful tool which enables them to explore basic concepts that are difficult to study and teach in an abstract conventional manner. Students can work with the text and software at a pace they can enjoy, varying the parameters of the simulated systems. Each section of the textbook is supplied with questions, exercises, and problems. Using some of the suggested simulation programs, students have the opportunity to perform interesting mini-research projects in physics and astronomy.

  3. A software-based tool for video motion tracking in the surgical skills assessment landscape.

    PubMed

    Ganni, Sandeep; Botden, Sanne M B I; Chmarra, Magdalena; Goossens, Richard H M; Jakimowicz, Jack J

    2018-01-16

    The use of motion tracking has been proven to provide objective assessment in surgical skills training. Current systems, however, require the use of additional equipment or specialised laparoscopic instruments and cameras to extract the data. The aim of this study was to determine the possibility of using a software-based solution to extract the data. Six expert and 23 novice participants performed a basic laparoscopic cholecystectomy procedure in the operating room. The recorded videos were analysed using Kinovea 0.8.15, and the following parameters were calculated: path length, average instrument movement, and the number of sudden or extreme movements. The analysed data showed that experts had a significantly shorter path length (median 127 cm vs. 187 cm, p = 0.01), smaller average movements (median 0.32 cm vs. 0.40 cm, p = 0.002) and fewer sudden movements (median 14.00 vs. 21.61, p = 0.001) than their novice counterparts. The use of software-based video motion tracking of laparoscopic cholecystectomy is a simple and viable method enabling objective assessment of surgical performance. It provides clear discrimination between expert and novice performance.
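
    Given a frame-by-frame track of the instrument tip, the three reported parameters are a few lines of arithmetic. A minimal sketch under stated assumptions; the threshold for a "sudden" movement below is illustrative, not Kinovea's or the authors' definition.

    ```python
    # Path length, average per-frame movement, and count of large jumps
    # from an (n_frames, 2) array of tracked positions in cm.
    import numpy as np

    def motion_metrics(xy, sudden_cm=1.5):
        steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # per-frame displacement
        path_length = steps.sum()
        average_movement = steps.mean()
        sudden_moves = int((steps > sudden_cm).sum())        # illustrative threshold
        return path_length, average_movement, sudden_moves

    track = np.array([[0, 0], [0.3, 0.1], [0.5, 0.4], [2.4, 1.9], [2.6, 2.0]])
    print(motion_metrics(track))
    ```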

  4. CBT Pilot Program Instructional Guide. Basic Drafting Skills Curriculum Delivered through CAD Workstations and Artificial Intelligence Software.

    ERIC Educational Resources Information Center

    Smith, Richard J.; Sauer, Mardelle A.

    This guide is intended to assist teachers in using computer-aided design (CAD) workstations and artificial intelligence software to teach basic drafting skills. The guide outlines a 7-unit shell program that may also be used as a generic authoring system capable of supporting computer-based training (CBT) in other subject areas. The first section…

  5. Methodology for automating software systems. Task 1 of the foundations for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1989-01-01

    The early stages of a research program designed to establish an experimental research platform for software engineering are described. Major emphasis is placed on Computer Assisted Software Engineering (CASE). The Poor Man's CASE Tool is based on the Apple Macintosh system, employing available software including Focal Point II, Hypercard, XRefText, and Macproject. These programs are functional in themselves, but through advanced linking they are available for operation from within the tool being developed. The research platform is intended to merge software engineering technology with artificial intelligence (AI); in the first prototype of the PMCT, however, the AI components are not included. CASE tools assist the software engineer in planning goals, routes to those goals, and ways to measure progress. The method described allows software to be synthesized instead of being written or built.

  6. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
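
    The cycle is easy to make concrete: write a failing test for behaviour that does not exist yet, then write just enough code to pass it. A minimal sketch in Python's unittest; the linear growth law is a placeholder, not the snowflake model's actual physics.

    ```python
    # Test-first sketch: GrowthTest would be written before grow_radius()
    # exists; the tests then drive the implementation.
    import unittest

    def grow_radius(r0, rate, dt):
        """Advance a crystal radius one time step (toy linear growth)."""
        return r0 + rate * dt

    class GrowthTest(unittest.TestCase):
        def test_radius_increases_one_step(self):
            self.assertAlmostEqual(grow_radius(1.0, 0.5, 2.0), 2.0)

        def test_zero_rate_is_static(self):
            self.assertEqual(grow_radius(1.0, 0.0, 10.0), 1.0)

    if __name__ == "__main__":
        unittest.main()
    ```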

  7. BioWord: A sequence manipulation suite for Microsoft Word

    PubMed Central

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  8. BioWord: a sequence manipulation suite for Microsoft Word.

    PubMed

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
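
    BioWord itself is written in VBA inside Word; as a language-neutral illustration of the kind of basic manipulation such a suite provides, here is a Python analogue of one of them, the reverse complement of a DNA sequence:

    ```python
    # Reverse complement: complement each base, then reverse the string.
    COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

    def reverse_complement(seq):
        return seq.translate(COMPLEMENT)[::-1]

    assert reverse_complement("ATGC") == "GCAT"
    ```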

  9. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control governs the speed of the tool's fastener interface mechanism and the torque it applies to fasteners, and monitors the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.

  10. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  11. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  12. Assistant Personal Robot (APR): Conception and Application of a Tele-Operated Assisted Living Robot.

    PubMed

    Clotet, Eduard; Martínez, Dani; Moreno, Javier; Tresanchez, Marcel; Palacín, Jordi

    2016-04-28

    This paper presents the technical description, mechanical design, electronic components, software implementation and possible applications of a tele-operated mobile robot designed as an assisted living tool. This robotic concept has been named Assistant Personal Robot (or APR for short) and has been designed as a remotely telecontrolled robotic platform built to provide social and assistive services to elderly people and those with impaired mobility. The APR features a fast high-mobility motion system adapted for tele-operation in plain indoor areas, which incorporates a high-priority collision avoidance procedure. This paper presents the mechanical architecture, electrical fundamentals and software implementation required in order to develop the main functionalities of an assistive robot. The APR uses a tablet in order to implement the basic peer-to-peer videoconference and tele-operation control combined with a tactile graphic user interface. The paper also presents the development of some applications proposed in the framework of an assisted living robot.

  13. gSRT-Soft: a generic software application and some methodological guidelines to investigate implicit learning through visual-motor sequential tasks.

    PubMed

    Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre

    2008-05-01

    Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.

  14. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    2000-01-01

    Resource Economics is a text for students with a background in calculus and intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
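
    The Solver exercises the book builds have a common shape: pick a harvest schedule to maximize discounted net revenue subject to stock dynamics. A hedged Python analogue of such an exercise, with illustrative parameter values rather than the book's:

    ```python
    # Optimal harvest of a fishery with logistic growth:
    # maximize sum_t p*h_t/(1+delta)^t  s.t.  x_{t+1} = x_t + r*x_t*(1 - x_t/K) - h_t
    import numpy as np
    from scipy.optimize import minimize

    r, K, p, delta, T, x_init = 0.5, 1.0, 1.0, 0.05, 20, 0.5   # toy parameters

    def neg_value(h):
        x, value = x_init, 0.0
        for t in range(T):
            harvest = min(h[t], x)                      # cannot exceed the stock
            value += p * harvest / (1 + delta) ** t     # discounted revenue
            x = x + r * x * (1 - x / K) - harvest       # logistic stock dynamics
        return -value

    res = minimize(neg_value, x0=np.full(T, 0.1),
                   bounds=[(0.0, K)] * T, method="L-BFGS-B")
    print("optimal discounted value:", -res.fun)
    ```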

  15. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer by high-speed communications, along with the development of specially tailored visualization software, have enabled the analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamic Simulation Facility. The next technology which this field requires is one that would eliminate visual clutter by extracting the key features of simulations of physical phenomena in order to create displays that clearly portray these key features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of technology to all levels of computers, specifically the inclusion of visualization primitives in basic software development for all workstations and PCs, is recommended.

  16. Computer-aided design of DNA origami structures.

    PubMed

    Selnihhin, Denis; Andersen, Ebbe Sloth

    2015-01-01

    The DNA origami method enables the creation of complex nanoscale objects that can be used to organize molecular components and to function as reconfigurable mechanical devices. Of relevance to synthetic biology, DNA origami structures can be delivered to cells where they can perform complicated sense-and-act tasks, and can be used as scaffolds to organize enzymes for enhanced synthesis. The design of DNA origami structures is a complicated matter and is most efficiently done using dedicated software packages. This chapter describes a procedure for designing DNA origami structures using a combination of state-of-the-art software tools. First, we introduce the basic method for calculating crossover positions between DNA helices and the standard crossover patterns for flat, square, and honeycomb DNA origami lattices. Second, we provide a step-by-step tutorial for the design of a simple DNA origami biosensor device, from schematic idea to blueprint creation and to 3D modeling and animation, and explain how careful modeling can facilitate later experimentation in the laboratory.
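
    A simplified, assumption-level sketch of the crossover-position idea (not the chapter's exact procedure): with roughly 10.5 base pairs per helical turn, the backbone azimuth advances 360/10.5 degrees per base, and a crossover to a neighbouring helix is geometrically possible where that azimuth points toward the neighbour within some tolerance.

    ```python
    # Candidate crossover positions along a helix, given the angular
    # direction of the neighbouring helix (all values illustrative).
    BP_PER_TURN = 10.5
    STEP_DEG = 360.0 / BP_PER_TURN

    def crossover_candidates(n_bases, neighbour_deg, tol_deg=10.0):
        out = []
        for i in range(n_bases):
            azimuth = (i * STEP_DEG) % 360.0
            diff = abs(azimuth - neighbour_deg)
            diff = min(diff, 360.0 - diff)     # wrap-around angular distance
            if diff <= tol_deg:
                out.append(i)
        return out

    # Neighbour at 120 degrees, one of three directions in a honeycomb lattice
    print(crossover_candidates(64, neighbour_deg=120.0))
    ```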

  17. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research, including earth and space sciences, acid rain and ozone modelling, and automotive design, to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. These outlines describe the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and any custom modifications to FAST done to further enhance the analysis. Some of the future directions for FAST are also described.

  18. Mycobacterium tuberculosis and whole genome sequencing: a practical guide and online tools available for the clinical microbiologist.

    PubMed

    Satta, G; Atzeni, A; McHugh, T D

    2017-02-01

    Whole genome sequencing (WGS) has the potential to revolutionize the diagnosis of Mycobacterium tuberculosis infection but the lack of bioinformatic expertise among clinical microbiologists is a barrier for adoption. Software products for analysis should be simple, free of charge, able to accept data directly from the sequencer (FASTQ files) and to provide the basic functionalities all-in-one. The main aim of this narrative review is to provide a practical guide for the clinical microbiologist, with little or no practical experience of WGS analysis, with a specific focus on software products tailor-made for M. tuberculosis analysis. With sequencing performed by an external provider, it is now feasible to implement WGS analysis in the routine clinical practice of any microbiology laboratory, with the potential to detect resistance weeks before traditional phenotypic culture methods, but the clinical microbiologist should be aware of the limitations of this approach. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  19. Implementing Educational Software and Evaluating Its Academic Effectiveness: Part I.

    ERIC Educational Resources Information Center

    Jolicoeur, Karen; Berger, Dale E.

    1988-01-01

    This basic plan for implementing educational software in the classroom incorporates a research design for evaluating its effectiveness. A study of fifth grade classrooms using game and tutorial software for spelling and fractions is used as an example. Topics discussed include software selection, selecting groups of comparable ability, and use of…

  20. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  1. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  2. Playing with Plug-ins

    ERIC Educational Resources Information Center

    Thompson, Douglas E.

    2013-01-01

    In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in most every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…

  3. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  4. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  5. What Software to Use in the Teaching of Mathematical Subjects?

    ERIC Educational Resources Information Center

    Berežný, Štefan

    2015-01-01

    We can consider two basic views, when using mathematical software in the teaching of mathematical subjects. First: How to learn to use specific software for the specific tasks, e. g., software Statistica for the subjects of Applied statistics, probability and mathematical statistics, or financial mathematics. Second: How to learn to use the…

  6. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  7. Individual radiation therapy patient whole-body phantoms for peripheral dose evaluations: method and specific software.

    PubMed

    Alziar, I; Bonniaud, G; Couanet, D; Ruaud, J B; Vicente, C; Giordana, G; Ben-Harrath, O; Diaz, J C; Grandjean, P; Kafrouni, H; Chavaudra, J; Lefkopoulos, D; de Vathaire, F; Diallo, I

    2009-09-07

    This study presents a method aimed at creating radiotherapy (RT) patient-adjustable whole-body phantoms to permit retrospective and prospective peripheral dose evaluations for enhanced patient radioprotection. Our strategy involves virtual whole-body patient models (WBPM) in different RT treatment positions for both genders and for different age groups. It includes a software tool designed to match the anatomy of the phantoms with the anatomy of the actual patients, based on the quality of patient data available. The procedure for adjusting a WBPM to patient morphology includes typical dimensions available in basic auxological tables for the French population. Adjustment is semi-automatic. Because of the complexity of the human anatomy, skilled personnel are required to validate changes made in the phantom anatomy. This research is part of a global project aimed at proposing appropriate methods and software tools capable of reconstituting the anatomy and dose evaluations in the entire body of RT patients in an adapted treatment planning system (TPS). The graphical user interface is that of a TPS, adapted to provide a comfortable working process. Such WBPM have been used to supplement patient therapy planning images, usually restricted to regions involved in treatment. Here we report, as an example, the case of a patient treated for prostate cancer whose therapy planning images were complemented by an anatomy model. Although present results are preliminary and our research is ongoing, they appear encouraging, since such patient-adjusted phantoms are crucial in the optimization of radiation protection of patients and for follow-up studies.

  8. NOTE: Individual radiation therapy patient whole-body phantoms for peripheral dose evaluations: method and specific software

    NASA Astrophysics Data System (ADS)

    Alziar, I.; Bonniaud, G.; Couanet, D.; Ruaud, J. B.; Vicente, C.; Giordana, G.; Ben-Harrath, O.; Diaz, J. C.; Grandjean, P.; Kafrouni, H.; Chavaudra, J.; Lefkopoulos, D.; de Vathaire, F.; Diallo, I.

    2009-09-01

    This study presents a method aimed at creating radiotherapy (RT) patient-adjustable whole-body phantoms to permit retrospective and prospective peripheral dose evaluations for enhanced patient radioprotection. Our strategy involves virtual whole-body patient models (WBPM) in different RT treatment positions for both genders and for different age groups. It includes a software tool designed to match the anatomy of the phantoms with the anatomy of the actual patients, based on the quality of patient data available. The procedure for adjusting a WBPM to patient morphology includes typical dimensions available in basic auxological tables for the French population. Adjustment is semi-automatic. Because of the complexity of the human anatomy, skilled personnel are required to validate changes made in the phantom anatomy. This research is part of a global project aimed at proposing appropriate methods and software tools capable of reconstituting the anatomy and dose evaluations in the entire body of RT patients in an adapted treatment planning system (TPS). The graphical user interface is that of a TPS, adapted to provide a comfortable working process. Such WBPM have been used to supplement patient therapy planning images, usually restricted to regions involved in treatment. Here we report, as an example, the case of a patient treated for prostate cancer whose therapy planning images were complemented by an anatomy model. Although present results are preliminary and our research is ongoing, they appear encouraging, since such patient-adjusted phantoms are crucial in the optimization of radiation protection of patients and for follow-up studies.

  9. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
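
    To illustrate the annunciator-to-vital-sign mapping described above, here is a minimal object-oriented sketch; it is not PT-SAFE's code (PT-SAFE alarm algorithms are programmed in MATLAB), and the class, thresholds, and file names are invented for the example.

      # Hypothetical sketch of mapping vital-sign samples to audible alarms.
      import time

      class Annunciator:
          def __init__(self, name, predicate, wav_file):
              self.name, self.predicate, self.wav_file = name, predicate, wav_file

          def update(self, vitals):
              # A real system would play wav_file; here we just log the event.
              if self.predicate(vitals):
                  print(f"[{time.strftime('%H:%M:%S')}] ALARM {self.name}: play {self.wav_file}")

      alarms = [
          Annunciator("SpO2 low", lambda v: v["spo2"] < 90, "spo2_low.wav"),
          Annunciator("HR high",  lambda v: v["hr"] > 120,  "hr_high.wav"),
      ]

      for sample in [{"spo2": 97, "hr": 80}, {"spo2": 88, "hr": 125}]:  # stand-in monitor feed
          for alarm in alarms:
              alarm.update(sample)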

  10. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  11. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    Abstract With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  12. [Microcomputer control of a LED stimulus display device].

    PubMed

    Ohmoto, S; Kikuchi, T; Kumada, T

    1987-02-01

    A visual stimulus display system controlled by a microcomputer was constructed at low cost. The system consists of a LED stimulus display device, a microcomputer, two interface boards, a pointing device (a "mouse") and two kinds of software. The first software package is written in BASIC. Its functions are: to construct stimulus patterns using the mouse, to construct letter patterns (alphabet, digit, symbols and Japanese letters--kanji, hiragana, katakana), to modify the patterns, to store the patterns on a floppy disc, to translate the patterns into integer data which are used to display the patterns in the second software. The second software package, written in BASIC and machine language, controls display of a sequence of stimulus patterns in predetermined time schedules in visual experiments.

  13. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a design collaboration tool, the SSETI Design Model (SDM), specifically developed for enabling satellite distributed design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  14. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  15. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  16. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

    This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture design environment.

  17. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  18. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.

    1984-01-01

    The project to automate the management of software production systems is described. The SAGA system is a software environment designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed to prototype form. The construction methods are described.

  19. eMZed: an open source framework in Python for rapid and interactive development of LC/MS data analysis workflows.

    PubMed

    Kiefer, Patrick; Schmitt, Uwe; Vorholt, Julia A

    2013-04-01

    The Python-based, open-source eMZed framework was developed for mass spectrometry (MS) users to create tailored workflows for liquid chromatography (LC)/MS data analysis. The goal was to establish a unique framework with comprehensive basic functionalities that are easy to apply and allow for the extension and modification of the framework in a straightforward manner. eMZed supports the iterative development and prototyping of individual evaluation strategies by providing a computing environment and tools for inspecting and modifying underlying LC/MS data. The framework specifically addresses non-expert programmers, as it requires only basic knowledge of Python and relies largely on existing successful open-source software, e.g. OpenMS. The framework eMZed and its documentation are freely available at http://emzed.biol.ethz.ch/. eMZed is published under the GPL 3.0 license, and an online discussion group is available at https://groups.google.com/group/emzed-users. Supplementary data are available at Bioinformatics online.

  20. Promoter classifier: software package for promoter database analysis.

    PubMed

    Gershenzon, Naum I; Ioshikhes, Ilya P

    2005-01-01

    Promoter Classifier is a package of seven stand-alone Windows-based C++ programs allowing the following basic manipulations with a set of promoter sequences: (i) calculation of positional distributions of nucleotides averaged over all promoters of the dataset; (ii) calculation of the averaged occurrence frequencies of the transcription factor binding sites and their combinations; (iii) division of the dataset into subsets of sequences containing or lacking certain promoter elements or combinations; (iv) extraction of the promoter subsets containing or lacking CpG islands around the transcription start site; and (v) calculation of spatial distributions of the promoter DNA stacking energy and bending stiffness. All programs have a user-friendly interface and provide the results in a convenient graphical form. The Promoter Classifier package is an effective tool for various basic manipulations with eukaryotic promoter sequences that usually are necessary for analysis of large promoter datasets. The program Promoter Divider is described in more detail as a representative component of the package.
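
    Function (i), the positional nucleotide distribution, is simple to sketch. The illustration below is not the package's code; it assumes equal-length promoter sequences aligned on the transcription start site.

      # Position-wise nucleotide frequencies averaged over a promoter set.
      from collections import Counter

      def positional_distribution(promoters):
          length = len(promoters[0])        # equal-length, TSS-aligned sequences assumed
          dist = []
          for i in range(length):
              counts = Counter(seq[i] for seq in promoters)
              total = sum(counts.values())
              dist.append({base: counts.get(base, 0) / total for base in "ACGT"})
          return dist

      demo = ["TATAAA", "TATAAT", "CATAAA"]
      print(positional_distribution(demo)[0])  # position 0: T appears 2/3, C 1/3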

  1. Producing genome structure populations with the dynamic and automated PGS software.

    PubMed

    Hua, Nan; Tjong, Harianto; Shin, Hanjun; Gong, Ke; Zhou, Xianghong Jasmine; Alber, Frank

    2018-05-01

    Chromosome conformation capture technologies such as Hi-C are widely used to investigate the spatial organization of genomes. Because genome structures can vary considerably between individual cells of a population, interpreting ensemble-averaged Hi-C data can be challenging, in particular for long-range and interchromosomal interactions. We pioneered a probabilistic approach for the generation of a population of distinct diploid 3D genome structures consistent with all the chromatin-chromatin interaction probabilities from Hi-C experiments. Each structure in the population is a physical model of the genome in 3D. Analysis of these models yields new insights into the causes and the functional properties of the genome's organization in space and time. We provide a user-friendly software package, called PGS, which runs on local machines (for practice runs) and high-performance computing platforms. PGS takes a genome-wide Hi-C contact frequency matrix, along with information about genome segmentation, and produces an ensemble of 3D genome structures entirely consistent with the input. The software automatically generates an analysis report, and provides tools to extract and analyze the 3D coordinates of specific domains. Basic Linux command-line knowledge is sufficient for using this software. A typical running time of the pipeline is ∼3 d with 300 cores on a computer cluster to generate a population of 1,000 diploid genome structures at topologically associating domain (TAD)-level resolution.

  2. Development of a green remediation tool in Japan.

    PubMed

    Yasutaka, Tetsuo; Zhang, Hong; Murayama, Koki; Hama, Yoshihito; Tsukada, Yasuhisa; Furukawa, Yasuhide

    2016-09-01

    The green remediation assessment tool for Japan (GRATJ) presented in this study is a spreadsheet-based software package developed to facilitate comparisons of the environmental impacts associated with various countermeasures against contaminated soil in Japan. This tool uses a life-cycle assessment-based model to calculate inventory inputs/outputs throughout the activity life cycle during remediation. Processes of 14 remediation methods for heavy metal contamination and 12 for volatile organic compound contamination are built into the tool. This tool can evaluate 130 inventory inputs/outputs and easily integrate those inputs/outputs into 9 impact categories, 4 integrated endpoints, and 1 index. Comparative studies can be performed by entering basic data associated with a target site. The integrated results can be presented in a simpler and clearer manner than the results of an inventory analysis. As a case study, an arsenic-contaminated soil remediation site was examined using this tool. Results showed that the integrated environmental impacts were greater with onsite remediation methods than with offsite ones. Furthermore, the contributions of CO2 to global warming, SO2 to urban air pollution, and crude oil to resource consumption were greater than other inventory inputs/outputs. The GRATJ has the potential to improve green remediation and can serve as a valuable tool for decision makers and practitioners in selecting countermeasures in Japan. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. GPM Timeline Inhibits For IT Processing

    NASA Technical Reports Server (NTRS)

    Dion, Shirley K.

    2014-01-01

    The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. The Global Precipitation Measurement (GPM) Mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of the inhibit and safety-critical software design for hazardous subsystems such as the high gain antenna boom, solar array and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new safety inhibit timeline tool was created for management of inhibits and their controls during spacecraft buildup and testing, during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls electrical inhibits. It strengthens the Safety Analyst's understanding of the removal of inhibits during the IT process with safety-critical software. With this tool, the Safety Analyst can confirm the proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to clarifying inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states with each phase of hardware and software testing, and the impact on safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.

  4. Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.

    2000-01-01

    The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many sub-domains called blocks, and solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computations. In environments with computers of different architecture, operating systems, CPU speed, memory size, load, and network speed, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computing environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application will be discussed. Also, the developed algorithms were combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results will be presented on running a NASA-based code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
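
    A minimal sketch of the underlying idea (not the authors' algorithm): greedily assign blocks to processors so that estimated finish times, weighted by relative CPU speed, stay even. A real dynamic balancer re-measures costs, loads, and network speeds during execution and migrates blocks accordingly.

      # Greedy longest-processing-time assignment of blocks to
      # heterogeneous processors; all numbers are illustrative.
      import heapq

      def assign_blocks(block_costs, cpu_speeds):
          heap = [(0.0, p) for p in range(len(cpu_speeds))]   # (finish time, processor)
          heapq.heapify(heap)
          assignment = {p: [] for p in range(len(cpu_speeds))}
          for block, cost in sorted(enumerate(block_costs), key=lambda x: -x[1]):
              t, p = heapq.heappop(heap)
              assignment[p].append(block)
              heapq.heappush(heap, (t + cost / cpu_speeds[p], p))
          return assignment

      # Five blocks, one processor twice as fast as the other:
      print(assign_blocks([4.0, 3.0, 2.0, 2.0, 1.0], cpu_speeds=[1.0, 2.0]))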

  5. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes, and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research that is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.

  6. NCEP BUFRLIB Software User Guide

    Science.gov Websites

    This document set describes how to use the NCEP BUFRLIB software to encode or decode BUFR messages. It is not intended to be a primer on the basic concepts of BUFR themselves; readers are assumed to have that background knowledge, and the guide focuses solely on how to use the BUFRLIB software.

  7. SOPanG: online text searching over a pan-genome.

    PubMed

    Cislak, Aleksander; Grabowski, Szymon; Holub, Jan

    2018-06-22

    The many thousands of high-quality genomes available nowadays imply a shift from single-genome to pan-genomic analyses. A basic algorithmic building block for such a scenario is online search over a collection of similar texts, a problem with surprisingly few solutions presented so far. We present SOPanG, a simple tool for exact pattern matching over an elastic-degenerate string, a recently proposed simplified model for the pan-genome. Thanks to bit-parallelism, it achieves pattern matching speeds above 400 MB/s, more than an order of magnitude higher than those of other software. SOPanG is available for free from: https://github.com/MrAlexSee/sopang. Supplementary data are available at Bioinformatics online.
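
    The core of such a search can be sketched with the classic Shift-And bit-parallel recurrence, extended so that a segment's output state is the union over its variants. The simplified illustration below is not SOPanG's implementation; it represents the elastic-degenerate text as a list of variant sets and reports the segments where a match ends.

      # Bit-parallel Shift-And over an elastic-degenerate string (sketch).
      def ed_search(pattern, ed_text):
          m = len(pattern)
          B = {}
          for i, ch in enumerate(pattern):        # per-character bit masks
              B[ch] = B.get(ch, 0) | (1 << i)
          match_bit, state, hits = 1 << (m - 1), 0, []
          for seg_idx, segment in enumerate(ed_text):
              out = 0
              for variant in segment:             # empty variants pass `state` through
                  d = state
                  for ch in variant:
                      d = ((d << 1) | 1) & B.get(ch, 0)
                      if d & match_bit:
                          hits.append(seg_idx)
                  out |= d
              state = out
          return hits

      # "ACG" matches across segments via A + C + the "GT" variant:
      print(ed_search("ACG", [{"A"}, {"C", ""}, {"GT", "AT"}]))  # [2]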

  8. Improving Motor and Drive System Performance – A Sourcebook for Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This sourcebook outlines opportunities to improve motor and drive systems performance. The sourcebook is divided into four main sections: (1) Motor and Drive System Basics: Summarizes important terms, relationships, and system design considerations relating to motor and drive systems. (2) Performance Opportunity Road Map: Details the key components of well-functioning motor and drive systems and opportunities for energy performance opportunities. (3) Motor System Economics: Offers recommendations on how to propose improvement projects based on corporate priorities, efficiency gains, and financial payback periods. (4) Where to Find Help: Provides a directory of organizations associated with motors and drives, as well as resources for additional information, tools, software, videos, and training opportunities.

  9. Measuring the RC time constant with Arduino

    NASA Astrophysics Data System (ADS)

    Pereira, N. S. A.

    2016-11-01

    In this work we use the Arduino UNO R3 open source hardware platform to assemble an experimental apparatus for the measurement of the time constant of an RC circuit. With adequate programming, the Arduino is used as a signal generator, a data acquisition system and a basic signal visualisation tool. Theoretical calculations are compared with direct observations from an analogue oscilloscope. Data processing and curve fitting are performed on a spreadsheet. The results obtained for the six RC test circuits are within the expected interval of values defined by the tolerance of the components. The hardware and software prove to be adequate for the proposed measurements and therefore adaptable to a laboratory teaching and learning context.
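
    The curve-fitting step, done in the paper on a spreadsheet, amounts to fitting the charging law V(t) = V0(1 - e^(-t/tau)) to the sampled capacitor voltage and reading off tau = RC. A sketch with synthetic data standing in for the Arduino samples (component values are illustrative):

      # Fit sampled capacitor voltage to the RC charging law.
      import numpy as np
      from scipy.optimize import curve_fit

      def charging(t, V0, tau):
          return V0 * (1.0 - np.exp(-t / tau))

      t = np.linspace(0, 0.05, 50)                                     # seconds
      V = charging(t, 5.0, 0.01) + np.random.normal(0, 0.02, t.size)   # fake samples

      (V0_fit, tau_fit), _ = curve_fit(charging, t, V, p0=[5.0, 0.005])
      print(f"tau = {tau_fit * 1e3:.2f} ms")   # ~10 ms, e.g. R = 10 kOhm, C = 1 uF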

  10. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  11. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand

    PubMed Central

    2018-01-01

    Background The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. Methods On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. Results All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. Conclusion These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. PMID:29441756

  12. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamic Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  13. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    PubMed

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  14. Online Assistants in Children's Hypermedia Software

    ERIC Educational Resources Information Center

    Garcia, Penny Ann

    2002-01-01

    The classroom teacher's comfort and familiarity with computers and software influences student-computer use in the classroom. Teachers remain mired in repetitive introduction of basic software mechanics and rarely progress with students to advanced concepts or complex applications. An Online Assistant (OLA) was developed to accompany the…

  15. Flowing Valued Information and Cyber-Physical Situational Awareness

    DTIC Science & Technology

    2012-01-01

    file type” constraints. The basic software supporting encryption and signing uses the OPENSSL software suite (the November 2009 version is...authorities for each organization can use OPENSSL software to generate their public and private keys. The MBTC does need to know the public or private

  16. Teaching Reprint File Management: Basic Principles and Software Programs.

    ERIC Educational Resources Information Center

    Wood, Elizabeth H.

    1989-01-01

    Describes a workshop for teaching library users how to manage reprint files which was developed at the University of Southern California Norris Medical Library. Software programs designed for this purpose are suggested, and a sidebar lists software features to consider. (eight references) (MES)

  17. Guideline for Software Documentation Management.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC.

    Designed as a basic reference for federal personnel concerned with the development, maintenance, enhancement, control, and management of computer-based systems, this manual provides a general overview of the software development process and software documentation issues so that managers can assess their own documentation requirements. Reference is…

  18. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  19. Why and how Mastering an Incremental and Iterative Software Development Process

    NASA Astrophysics Data System (ADS)

    Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe

    2004-06-01

    One of the key issues regularly mentioned in the current software crisis of the space domain relates to the software development process, which must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages:
    - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed first, so that the architecture concept is validated very early, without the details.
    - A software prototype becomes available very quickly. It improves communication between the system and software teams, as it enables a very early and efficient check of the common understanding of the system requirements.
    - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. In any case, it greatly improves the learning curve of the software team.
    These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and induces difficulties such as:
    - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable?
    - How to distinguish stable/unstable and dimensioning/standard requirements?
    - How to plan the development of each increment?
    - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews (Software Specification Review, Preliminary Design Review, Critical Design Review, Code Review, etc.) be performed?
    Several solutions envisaged or already deployed by EADS SPACE Transportation are presented, both from a methodological and a technological point of view:
    - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way.
    - How the CMM approach can help by better formalizing the Requirements Management and Planning processes.
    - How Automatic Code Generation with "certified" tools (SCADE) can still dramatically shorten the development cycle.
    The presentation concludes with an evaluation of the cost and planning reduction, based on a pilot application, comparing figures from two similar projects: one with the classical waterfall process, the other with an iterative and incremental approach.

  20. Software for Secondary-School Learning About Robotics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Truong, Dat; Hodgson, Terry R.

    2005-01-01

    The ROVer Ranch is an interactive computer program designed to help secondary-school students learn about space-program robotics and related basic scientific concepts by involving the students in simplified design and programming tasks that exercise skills in mathematics and science. The tasks involve building simulated robots and then observing how they behave. The program furnishes (1) programming tools that a student can use to assemble and program a simulated robot and (2) a virtual three-dimensional mission simulator for testing the robot. First, the ROVer Ranch presents fundamental information about robotics, mission goals, and facts about the mission environment. On the basis of this information, and using the aforementioned tools, the student builds a simulated robot to accomplish its mission by selecting parts from such subsystems as propulsion, navigation, and scientific tools. Once the robot is built, it is programmed and then placed in a three-dimensional simulated environment. Success or failure in the simulation depends on the planning and design of the robot. Data and results of the mission are available in a summary log once the mission is concluded.

  1. Development of a knowledge management system for complex domains.

    PubMed

    Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg

    2012-01-01

    Deutsche Flugsicherung GmbH, the German Air Navigation Service Provider, follows a systematic approach, called HERA, for investigating incidents. The HERA analysis shows a distinctive occurrence of incidents in German air traffic control in which the visual perception of information plays a key role. The reasons can be partially traced back to workstation design, where basic ergonomic rules and principles are not sufficiently followed by the designers in some cases. In cooperation with the Institute of Ergonomics in Darmstadt, the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools were found to be able to meet the identified user requirements holistically. Therefore, the development of an enhanced software tool, called the Design Process Guide, was suggested. The name Design Process Guide indicates that this tool exceeds the classic functions of currently available Knowledge Management Systems. It offers "design element"-based access, covers process- and content-related topics, and shows the implications of certain design decisions. Furthermore, it serves as documentation, detailing why a designer made a decision under a particular set of conditions.

  2. Generation of non-genomic oligonucleotide tag sequences for RNA template-specific PCR

    PubMed Central

    Pinto, Fernando Lopes; Svensson, Håkan; Lindblad, Peter

    2006-01-01

    Background In order to overcome genomic DNA contamination in transcriptional studies, reverse template-specific polymerase chain reaction, a modification of reverse transcriptase polymerase chain reaction, is used. The possibility of using tags whose sequences are not found in the genome further improves reverse template-specific polymerase chain reaction experiments. Given the absence of software available to produce genome-suitable tags, a simple tool to fulfill this need was developed. Results The program was developed in Perl, with separate use of the Basic Local Alignment Search Tool (BLAST), making the tool platform independent (known to run on Windows XP and Linux). In order to test the performance of the generated tags, several molecular experiments were performed. The results show that Tagenerator is capable of generating tags with good priming properties, which will deliberately not result in PCR amplification of genomic DNA. Conclusion The program Tagenerator is capable of generating tag sequences that combine absence from the genome with good priming properties for RT-PCR based experiments, circumventing the effects of genomic DNA contamination in an RNA sample. PMID:16820068
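
    The essential loop is easy to sketch: propose random tags and keep those with plausible priming properties that do not occur in the genome. The toy version below is not the Perl program itself; it substitutes a plain substring test for the BLAST search and uses an invented GC-content filter as the priming criterion.

      # Toy tag generator: random candidates filtered for GC balance and
      # absence from the genome (stand-in for a BLAST-based check).
      import random

      def generate_tags(genome, n_tags=5, tag_len=20, seed=1):
          random.seed(seed)
          tags = []
          while len(tags) < n_tags:
              tag = "".join(random.choice("ACGT") for _ in range(tag_len))
              gc = (tag.count("G") + tag.count("C")) / tag_len
              if 0.4 <= gc <= 0.6 and tag not in genome:
                  tags.append(tag)
          return tags

      genome = "".join(random.choice("ACGT") for _ in range(10_000))  # stand-in genome
      print(generate_tags(genome))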

  3. Ensemble: an Architecture for Mission-Operations Software

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Powell, Mark; Fox, Jason; Rabe, Kenneth; Shu, IHsiang; McCurdy, Michael; Vera, Alonso

    2008-01-01

    Ensemble is the name of an open architecture for, and a methodology for the development of, spacecraft mission operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations- type software. Ensemble capitalizes on the strengths of the open-source Eclipse software and its architecture to address several issues that have arisen repeatedly in the development of mission-operations software: Heretofore, mission-operations application programs have been developed in disparate programming environments and integrated during the final stages of development of missions. The programs have been poorly integrated, and it has been costly to develop, test, and deploy them. Users of each program have been forced to interact with several different graphical user interfaces (GUIs). Also, the strategy typically used in integrating the programs has yielded serial chains of operational software tools of such a nature that during use of a given tool, it has not been possible to gain access to the capabilities afforded by other tools. In contrast, the Ensemble approach offers a low-risk path towards tighter integration of mission-operations software tools.

  4. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime that may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the absence of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  5. Family-Based Benchmarking of Copy Number Variation Detection Software.

    PubMed

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.

  6. A Student Experiment Method for Learning the Basics of Embedded Software Technologies Including Hardware/Software Co-design

    NASA Astrophysics Data System (ADS)

    Kambe, Hidetoshi; Mitsui, Hiroyasu; Endo, Satoshi; Koizumi, Hisao

    The applications of embedded system technologies have spread widely in various products, such as home appliances, cellular phones, automobiles, industrial machines and so on. Due to intensified competition, embedded software has expanded its role in realizing sophisticated functions, and new development methods like hardware/software (HW/SW) co-design, which unites HW and SW development, have been researched. The shortfall of embedded SW engineers in Japan was estimated at approximately 99,000 in 2006. Embedded SW engineers should understand HW technologies and system architecture design as well as SW technologies. However, few universities offer this kind of education systematically. We propose a student experiment method for learning the basics of embedded system development, which includes a set of experiments for developing embedded SW, developing embedded HW and experiencing HW/SW co-design. The co-design experiment helps students learn the basics of embedded system architecture design and the flow of designing actual HW and SW modules. We developed these experiments and evaluated them.

  7. Estimating Computer-Based Training Development Times

    DTIC Science & Technology

    1987-10-14

    beginners, must be sure they interpret terms correctly. As a result of this informal validation, the authors suggest refinements in the tool which... Productivity tools available: automated design tools, text processor interfaces, flowcharting software, software interfaces, multimedia interfaces

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shasharina, Svetlana

    The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying the tools to be more usable.

  9. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    PubMed Central

    Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-01

    There are several specific solutions for accessing the sensors and actuators present in any process or system through a TCP/IP network, either local or a wide-area network like the Internet. The use of sensors and actuators of different natures and with diverse interfaces (SPI, I2C, analogue, etc.) makes homogeneous and secure network access to them more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called the convergence subsystem in this paper, and the Human Machine Interface (HMI), which is designed using the intuitive graphical system of EJS and located on the user’s computer. The proposed hardware architectures and software tools are described, and experimental implementations with the proposed tools are presented. PMID:28067801

  10. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing the sensors and actuators present in any process or system through a TCP/IP network, either local or a wide-area network like the Internet. The use of sensors and actuators of different natures and with diverse interfaces (SPI, I2C, analogue, etc.) makes homogeneous and secure network access to them more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called the convergence subsystem in this paper, and the Human Machine Interface (HMI), which is designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described, and experimental implementations with the proposed tools are presented.
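
    As a rough illustration of the convergence-subsystem idea, the following Python sketch exposes a (simulated) sensor behind a uniform line-oriented TCP protocol, so that any network client can read it without knowing the native SPI/I2C/analogue interface. The "READ <name>" command and the read_sensor() stub are assumptions for illustration; they are not the protocol implemented by the EJS tools.

        # Minimal "convergence subsystem" sketch: a TCP server hides the
        # sensor's native interface behind a uniform text protocol.
        import socketserver

        def read_sensor(name: str) -> float:
            # Stand-in for driver code that would talk SPI/I2C/ADC to hardware.
            fake_values = {"temperature": 21.5, "humidity": 48.0}
            return fake_values.get(name, float("nan"))

        class SensorHandler(socketserver.StreamRequestHandler):
            def handle(self):
                for raw in self.rfile:                  # one command per line
                    cmd = raw.decode().strip().split()
                    if len(cmd) == 2 and cmd[0].upper() == "READ":
                        self.wfile.write(f"{read_sensor(cmd[1])}\n".encode())
                    else:
                        self.wfile.write(b"ERR unknown command\n")

        if __name__ == "__main__":
            # Any TCP client (an HMI, a script, telnet) can query the sensor.
            with socketserver.TCPServer(("0.0.0.0", 5050), SensorHandler) as srv:
                srv.serve_forever()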

  11. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools, including Mathematica, the GOCE User Toolbox, the web server of the ICGEM (International Center for Global Earth Models), and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software, we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas for future research.
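
    For readers unfamiliar with the reduction involved, the standard first-order Bouguer slab correction is delta_g = 2*pi*G*rho*h. The Python sketch below applies it; the physical constants are standard, but the function names and the scalar simplification are illustrative only, since the software described above operates on gridded gravity-tensor data with more elaborate terrain handling.

        # First-order Bouguer slab correction: delta_g = 2*pi*G*rho*h.
        import math

        G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
        RHO_CRUST = 2670.0      # standard reduction density, kg/m^3

        def bouguer_slab_correction(height_m: float,
                                    rho: float = RHO_CRUST) -> float:
            """Gravity effect of an infinite slab of given thickness, in mGal."""
            delta_g_si = 2.0 * math.pi * G * rho * height_m   # m/s^2
            return delta_g_si * 1e5                           # 1 m/s^2 = 1e5 mGal

        def bouguer_anomaly(observed_mgal: float, normal_mgal: float,
                            free_air_mgal: float, height_m: float) -> float:
            """Observed minus normal gravity, with free-air and slab corrections."""
            return (observed_mgal - normal_mgal + free_air_mgal
                    - bouguer_slab_correction(height_m))

        print(f"{bouguer_slab_correction(1000.0):.1f} mGal for a 1 km slab")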

  12. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C.; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package that calculates metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  13. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
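
    The benchmark logic lends itself to a compact illustration: because each species in a hybrid proteome sample is spiked at a known ratio between conditions A and B, measured log2(A/B) values can be scored for accuracy (offset from the expected ratio) and precision (spread). The Python sketch below is a hedged simplification with invented ratios and toy intensities; LFQbench's actual metrics and dataset design are richer.

        # Toy accuracy/precision metric in the spirit of an LFQbench-style
        # hybrid-proteome benchmark. Expected ratios below are invented.
        import math
        import statistics

        expected_log2 = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}  # log2(A/B)

        def species_metrics(intensities_a, intensities_b, species):
            """Accuracy = median deviation from the expected log-ratio;
            precision = standard deviation of the measured log-ratios."""
            ratios = [math.log2(a / b)
                      for a, b in zip(intensities_a, intensities_b)]
            accuracy = statistics.median(ratios) - expected_log2[species]
            precision = statistics.stdev(ratios)
            return accuracy, precision

        # Toy measurements for three yeast proteins (expected log2 ratio = 1.0):
        a = [2050.0, 1980.0, 2110.0]
        b = [1000.0, 1020.0, 990.0]
        acc, prec = species_metrics(a, b, "yeast")
        print(f"accuracy offset: {acc:+.3f} log2 units, precision (SD): {prec:.3f}")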

  14. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    PubMed

    Xiao, Pengdong; Leng, Shuang; Zhao, Xiaodan; Zou, Hua; Tan, Ru San; Wong, Philip; Zhong, Liang

    2016-08-01

    The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over the cardiac cycle, including systole and diastole. In this paper, a software tool that tracks AVJ motion in cardiovascular magnetic resonance (CMR) images is presented, built with the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK and Qt are convenient software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.

  15. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    In the present paper, the novel software GTest is introduced, designed for testing the normality of a user-specified empirical distribution. It has two unusual characteristics: the first is the user option of selecting among four different versions of the normality test, each suited to a specific dataset or goal, and the second is the inferential paradigm that informs the output of such tests, which is basically graphical and intrinsically self-explanatory. Inference-by-eye is an emerging inferential approach that should find successful application in the near future, given the growing need to widen the audience of users of statistical methods to people with informal statistical skills. For instance, the latest European regulations concerning environmental issues introduced strict protocols for data handling (data quality assurance, outlier detection, etc.) and information exchange (areal statistics, trend detection, etc.) between regional and central environmental agencies. Therefore, more and more frequently, laboratory and field technicians will be requested to use complex software applications to subject data from monitoring, surveying or laboratory activities to specific statistical analyses. Unfortunately, inferential statistics, which actually influence the decisional processes for the correct management of environmental resources, are often implemented in a way that expresses outcomes in numerical form with brief comments in strict statistical jargon (degrees of freedom, level of significance, accepted/rejected H0, etc.). The interpretation of such outcomes is therefore often difficult for people with little statistical knowledge. In such a framework, the paradigm of visual inference can help fill this gap, providing outcomes in self-explanatory graphical form with a brief comment in common language. The difficulties experienced by colleagues, and their requests for an effective tool for addressing them, motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability needed. GTest provides true graphical normality tests that are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with an acceptance region outlined around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
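
    The decision rule described above can be sketched in a few lines of code. The version below (Python rather than GTest's VBA) builds a pointwise Monte Carlo acceptance envelope around the normal order statistics at a chosen alpha and sample size, and accepts normality when the standardized, sorted sample stays inside it. The envelope construction is a common simulation approach assumed here for illustration; GTest's exact acceptance region may be defined differently.

        # Graphical normality test via a Monte Carlo Q-Q envelope (a sketch,
        # not GTest's actual algorithm).
        import numpy as np

        def normality_envelope(n, alpha=0.05, n_sim=5000, rng=None):
            """Pointwise (1 - alpha) envelope for N(0,1) order statistics."""
            rng = rng or np.random.default_rng(0)
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
            lo = np.quantile(sims, alpha / 2, axis=0)
            hi = np.quantile(sims, 1 - alpha / 2, axis=0)
            return lo, hi

        def graphical_normality_test(x, alpha=0.05):
            """Standardize and sort the sample; 'accept' normality if every
            order statistic lies inside the envelope, i.e. the Q-Q scatter
            stays within the acceptance region."""
            x = np.asarray(x, dtype=float)
            z = np.sort((x - x.mean()) / x.std(ddof=1))
            lo, hi = normality_envelope(len(x), alpha)
            inside = (z >= lo) & (z <= hi)
            return inside.all(), z, lo, hi

        sample = np.random.default_rng(1).normal(10, 2, 50)
        ok, z, lo, hi = graphical_normality_test(sample)
        print("compatible with normality" if ok else "departs from normality")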

  16. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva as part of a hospital-wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of non-computer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. The software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different platforms: Unix X-11/OSF-Motif workstations and the Macintosh family.

  17. Security Risks: Management and Mitigation in the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.

    2004-01-01

    A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and its integration with a Defect Detection and Prevention (DDP) risk management tool.

  18. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  19. The Software Management Environment (SME)

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  20. A study of the academic performance of medical students in the comprehensive examination of the basic sciences according to the indices of emotional intelligence and educational status.

    PubMed

    Moslehi, Mohsen; Samouei, Rahele; Tayebani, Tayebeh; Kolahduz, Sima

    2015-01-01

    Considering the increasing importance of emotional intelligence (EI) in different aspects of life, such as academic achievement, the present survey aims to predict the academic performance of medical students in the comprehensive examination of the basic sciences according to indices of emotional intelligence and educational status. The survey is a descriptive, analytical, cross-sectional study performed on the medical students of Isfahan, Tehran, and Mashhad Universities of Medical Sciences. The universities were sampled randomly, after which the students were selected, taking their limited numbers into consideration. Based on the inclusion criteria, all medical students who matriculated in 2005 and attended the comprehensive basic sciences examination in 2008 entered the study. The data collection tools included an Emotional Intelligence Questionnaire (standardized in Isfahan), the average scores of the first to fifth semesters, the total average over the five semesters, and the grade in the comprehensive basic sciences examination. The data were analyzed by stepwise regression using SPSS software version 15. The results indicated that the independence indicator of the emotional intelligence test and the average scores of the first and third academic semesters were significant predictors of the students' academic performance in the comprehensive basic sciences examination. According to these results, the average scores of students, especially in the earlier semesters, as well as the independence indicator and students' self-esteem scores, can influence their success in the comprehensive basic sciences examination.

  1. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for the development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile approach to testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise behind agile software development and testing is that work is iterative and incremental: requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, each release allows for increased features and enhanced value. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than in a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. The SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner. Through agile testing, there has been increased value through individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to change through controlled planning. The presentation will describe the agile testing methodology as applied by the SLS FSW Test and Verification team at Marshall Space Flight Center.

  2. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  3. SUNREL Related Links | Buildings | NREL

    Science.gov Websites

    SUNREL Related Links. DOE Simulation Software Tools Directory: a directory of 301 building software tools for the evaluation of energy efficiency, renewable energy, and sustainability in buildings. TREAT Software Program: a computer program that uses SUNREL and is designed to provide

  4. IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.

    ERIC Educational Resources Information Center

    Sheehan, Mark C.; Williams, James G.

    1987-01-01

    Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)

  5. The Holistic Targeting (HOT) Methodology as the Means to Improve Information Operations (IO) Target Development and Prioritization

    DTIC Science & Technology

    2008-09-01

    ...the use of Compendium software to facilitate targeting problem understanding, and the network analysis tool, Palantir, as an efficient and tailored semi-automated means to... OBJECTIVES USING COMPENDIUM SOFTWARE .....63 E. HOT TARGET PRIORITIZATION AND DEVELOPMENT USING PALANTIR SOFTWARE .....69

  6. Validation of Tendril TrueHome Using Software-to-Software Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan

    This study performed a comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and to resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation and scrutiny.

  7. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea: it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe ongoing work at JPL on providing assurance organizations with the information they need to use such tools effectively.

  8. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  9. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high-frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open-source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools is provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, lightweight solution for environmental data and metadata management, but it can also be used in conjunction with other cyberinfrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
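
    The template-driven quality control described above can be pictured with a small sketch. The toolbox itself is MATLAB with its own rule grammar; the Python fragment below only illustrates the general pattern of declarative, per-column rules that attach qualifier flags to values instead of deleting them. All rule syntax, flag codes and thresholds are invented for the example.

        # Declarative, metadata-style QC rules: every matching rule
        # contributes its qualifier flag; values are flagged, never deleted.
        FLAG_QUESTIONABLE = "Q"
        FLAG_INVALID = "I"

        # Template-style rules: (condition on value, flag to assign)
        rules = {
            "water_temp_c": [
                (lambda v: v < -5 or v > 40, FLAG_INVALID),       # physical range
                (lambda v: v < 0 or v > 35, FLAG_QUESTIONABLE),   # seasonal range
            ],
        }

        def apply_qc(column, values):
            """Return (value, flags) pairs for one data column."""
            flagged = []
            for v in values:
                flags = "".join(flag for cond, flag in rules.get(column, [])
                                if cond(v))
                flagged.append((v, flags))
            return flagged

        for value, flags in apply_qc("water_temp_c", [12.3, 38.5, 55.0]):
            print(f"{value:6.1f}  flags={flags or '-'}")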

  10. Runtime Performance Monitoring Tool for RTEMS System Software

    NASA Astrophysics Data System (ADS)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. The tool can be used during the software development phase and during in-orbit operation as well. The implemented target agent is lightweight and incurs small overhead over the SpaceWire interface. Efforts to reduce the overhead and to add other monitoring parameters are currently under way.

  11. VSO For Dummies

    NASA Astrophysics Data System (ADS)

    Schwartz, Richard A.; Zarro, D.; Csillaghy, A.; Dennis, B.; Tolbert, A. K.; Etesi, L.

    2009-05-01

    We report on our activities to integrate VSO search and retrieval capabilities into standard data access, display, and analysis tools. In addition to its standard Web-based search form, the VSO provides an Interactive Data Language (IDL) client (vso_search) that is available through the Solar Software (SSW) package. We have incorporated this client into an IDL-widget interface program (show_synop) that allows simpler searching and downloading of VSO datasets directly into a user's IDL data analysis environment. In particular, we have provided the capability to read VSO datasets into a general-purpose IDL package (plotman) that can display different data types (lightcurves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. Currently, the show_synop tool supports access to ground-based and space-based (SOHO, STEREO, and Hinode) observations, and has the capability to include new datasets as they become available. A user encounters two major hurdles when using the VSO: (1) instrument-specific software (such as level-0 file readers and data-prepping procedures) may not be available in the user's local SSW distribution; (2) recent calibration files (such as flat-fields) are not automatically distributed with the analysis software. To address these issues, we have developed a dedicated server (prepserver) that incorporates all the latest instrument-specific software libraries and calibration files. The prepserver uses an IDL-Java bridge to read and implement data processing requests from a client and return a processed data file that can be readily displayed with the show_synop/plotman package. The advantage of the prepserver is that the user is only required to install the general branch (gen) of the SSW tree, and is freed from the more onerous task of installing instrument-specific libraries and calibration files. We will demonstrate how the prepserver can be used to read, process, and overlay SOHO/EIT, TRACE, SECCHI/EUVI, and RHESSI images.

  12. Browsing Software of the Visible Korean Data Used for Teaching Sectional Anatomy

    ERIC Educational Resources Information Center

    Shin, Dong Sun; Chung, Min Suk; Park, Hyo Seok; Park, Jin Seo; Hwang, Sung Bae

    2011-01-01

    The interpretation of computed tomographs (CTs) and magnetic resonance images (MRIs) to diagnose clinical conditions requires basic knowledge of sectional anatomy. Sectional anatomy has traditionally been taught using sectioned cadavers, atlases, and/or computer software. The computer software commonly used for this subject is practical and…

  13. Software Engineering Basics: A Primer for the Project Manager.

    DTIC Science & Technology

    1982-06-01

    computer software [45, 46]. It is named after Ada Augusta, who is generally credited as having been the first programmer as an assistant to Charles Babbage, and is called, appropriately enough, ADA. The development of one common programming language for tactical software clearly has the potential for

  14. Use of Software Tools in Teaching Relational Database Design.

    ERIC Educational Resources Information Center

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  15. Analyzing the Core Flight Software (CFS) with SAVE

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David

    2008-01-01

    This viewgraph presentation describes the SAVE tool and its application to the Core Flight Software (CFS). The contents include: 1) Fraunhofer: a short intro; 2) Context of this collaboration; 3) CFS: Core Flight Software?; 4) The SAVE tool; 5) Applying SAVE to CFS: a few example analyses; and 6) Goals.

  16. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  17. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  18. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open-source software development community to update and maintain the software. HydroDesktop is a local (i.e., not server-based) client-side software tool that will ultimately run on multiple operating systems and provide a highly usable level of access to HIS services. The software provides many key capabilities, including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation focuses on the design approach and paradigm, and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, the project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.

  19. Software engineering from a Langley perspective

    NASA Technical Reports Server (NTRS)

    Voigt, Susan

    1994-01-01

    A brief introduction to software engineering is presented. The talk is divided into four sections, beginning with the question 'What is software engineering?', followed by a brief history of the progression of software engineering at the Langley Research Center in the context of an expanding computing environment. Several basic concepts and terms are introduced, including software development life cycles and maturity levels. Finally, comments are offered on what software engineering means for the Langley Research Center and where to find more information on the subject.

  20. Payload software technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.
