Science.gov

Sample records for addition software tools

  1. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using the Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.
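
    To make the "part program" idea concrete, here is a minimal sketch that emits a schematic APT-style fragment: geometry definitions followed by motion statements. Statement names follow classic APT textbook usage, but the coordinates, cutter size, and exact syntax are illustrative assumptions; real APT dialects vary by processor.

    ```python
    # Emit a schematic APT-style part program (illustrative only; the
    # geometry and cutter values are invented, and real APT dialects
    # vary by processor).
    apt_lines = [
        "PARTNO DEMO PLATE",         # program header
        "CUTTER/0.5",                # cutter diameter
        "TOLER/0.001",               # machining tolerance
        "P1 = POINT/0.0, 0.0, 0.0",  # geometry definitions...
        "P2 = POINT/4.0, 0.0, 0.0",
        "P3 = POINT/4.0, 3.0, 0.0",
        "L1 = LINE/P1, P2",
        "L2 = LINE/P2, P3",
        "FROM/P1",                   # ...then motion statements:
        "GO/TO, L1",                 # approach the first line,
        "GORGT/L1, PAST, L2",        # cut along L1 until past L2,
        "GOHOME",                    # retract,
        "FINI",                      # end of part program
    ]
    print("\n".join(apt_lines))
    ```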

  2. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as ''autoconf'', ''automake'', ''libtool'', and the de facto standard build tool, ''make''. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  3. Software Quality Tools

    DTIC Science & Technology

    1988-05-04

    (Abstract not recoverable: the record text consists of OCR fragments from the report's DD Form 1473 documentation page.)

  4. Software Tool Issues

    NASA Astrophysics Data System (ADS)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  5. User Interface Software Tools

    DTIC Science & Technology

    1994-08-01

    19. Mark A. Flecchia and R. Daniel Bergeron. Specifying Complex Dialogs in ALGAE. Human Factors in Computing Systems, CHI+GI '87, Toronto, Ont...Spreadsheet Model. Tech. Rept. GIT-GVU-93-20, Georgia Tech Graphics, Visualization and Usability Center, May, 1993. 35. Daniel H.H. Ingalls. "The Smalltalk...Interactive Graphical Applications". Comm. ACM 36, 4 (April 1993), 41-55. 38. Anthony Karrer and Walt Scacchi. Requirements

  6. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.

  7. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools, that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  8. Biological imaging software tools.

    PubMed

    Eliceiri, Kevin W; Berthold, Michael R; Goldberg, Ilya G; Ibáñez, Luis; Manjunath, B S; Martone, Maryann E; Murphy, Robert F; Peng, Hanchuan; Plant, Anne L; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R; Tomancak, Pavel; Carpenter, Anne E

    2012-06-28

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the inherent challenges and the overall status of available software for bioimage informatics, focusing on open-source options.

  9. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  10. Software Tools: EPICUR.

    ERIC Educational Resources Information Center

    Abreu, Jose Luis; And Others

    EPICUR (Integrated Programming Environment for the Development of Educational Software) is a set of programming modules ranging from low level interfaces to high level algorithms aimed at the development of computer-assisted instruction (CAI) applications. The emphasis is on user-friendly interfaces and on multiplying productivity without loss of…

  11. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
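
    For illustration, QSAR models of this kind map numerical molecular descriptors to a toxicity endpoint. Below is a minimal hedged sketch of the idea using an ordinary least-squares fit; the descriptors, data, and coefficients are invented and are not TEST's actual models.

    ```python
    import numpy as np

    # Toy QSAR sketch: fit log(LC50) from two invented molecular
    # descriptors (say, logP and molecular weight). All numbers are
    # illustrative; TEST's real QSAR methodologies are more elaborate.
    X = np.array([[1.2, 120.0],
                  [2.3, 180.0],
                  [0.8,  95.0],
                  [3.1, 210.0]])           # rows: chemicals; cols: descriptors
    y = np.array([0.5, -0.2, 0.9, -0.6])   # measured log(LC50), invented

    A = np.column_stack([X, np.ones(len(X))])   # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    new_chem = np.array([1.5, 140.0, 1.0])      # descriptors + intercept
    print("predicted log(LC50):", new_chem @ coef)
    ```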

  12. Software Tools for Shipbuilding Productivity

    DTIC Science & Technology

    1984-12-01

    10.3.3.1 User Output 10.3.3.2 Machine Output 10.4 Software Tools and Required Environment 10.5 Software Tool Availability 10.6...of 'old code' to new machines and systems cost-effectively. o Mediational Utility Methods. Methodologies and tools with the ability to create an... machine tools, and ships. The Discrete Batch Manufacturing environment characterizes the majority of manufacturing tasks in the United States. It is

  13. STE - The Software Tools Editor

    NASA Astrophysics Data System (ADS)

    Software Tools is an excellent book written by B. W. Kernighan and P. J. Plauger, published by Addison-Wesley. In it the authors discuss how to write programs that make good tools, and how to program well in the process. One of the tools they develop is a fairly powerful editor, written in Ratfor (a structured form of FORTRAN IV). This program has been implemented on the UCL Starlink VAX (with a few modifications and extensions) and is recommended as the editor to use on the VAX. This note gives a brief introduction to, and description of, the editor, which has been abstracted from the book (which you are recommended to buy). There are some short command summary sections at the end of this note. After reading this note you may like to print these short files and use them for reference when using the editor.

  14. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  15. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  16. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  17. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  20. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  1. Software development tools: A bibliography, appendix C.

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.

    1980-01-01

    A bibliography containing approximately 200 citations on tools which help software developers perform some development task (such as text manipulation, testing, etc.), and which would not necessarily be found as part of a computing facility is given. The bibliography comes from a relatively random sampling of the literature and is not complete. But it is indicative of the nature and range of tools currently being prepared or currently available.

  2. Software and tools for microarray data analysis.

    PubMed

    Mehta, Jai Prakash; Rani, Sweta

    2011-01-01

    A typical microarray experiment results in a series of images, depending on the experimental design and number of samples. Software analyses the images to obtain the intensity at each spot and quantify the expression for each transcript. This is followed by normalization, and then various data analysis techniques are applied to the data. The whole analysis pipeline requires a large number of software tools to accurately handle the massive amount of data. Fortunately, there are a large number of freely available and commercial software packages to churn the massive amount of data into manageable sets of differentially expressed genes, functions, and pathways. This chapter describes the software and tools which can be used to analyze gene expression data right from the image analysis to gene list, ontology, and pathways.
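
    As a small illustration of one step in such a pipeline, the sketch below computes median-centered log2 ratios for a two-channel array; the intensities are invented, and real pipelines add background correction, within- and between-array normalization, and quality filtering.

    ```python
    import numpy as np

    # Two-channel microarray normalization sketch (invented intensities).
    sample = np.array([1200.0, 340.0, 8800.0, 150.0])     # Cy5 spot intensities
    reference = np.array([1000.0, 400.0, 4000.0, 180.0])  # Cy3 spot intensities

    m = np.log2(sample / reference)   # expression log-ratios per spot
    m_centered = m - np.median(m)     # remove array-wide dye/loading bias
    print(m_centered)
    ```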

  3. Software for systems biology: from tools to integrated platforms.

    PubMed

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2011-11-03

    Understanding complex biological systems requires extensive support from software tools. Such tools are needed at each step of a systems biology computational workflow, which typically consists of data handling, network inference, deep curation, dynamical simulation and model analysis. In addition, there are now efforts to develop integrated software platforms, so that tools that are used at different stages of the workflow and by different researchers can easily be used together. This Review describes the types of software tools that are required at different stages of systems biology research and the current options that are available for systems biology researchers. We also discuss the challenges and prospects for modelling the effects of genetic changes on physiology and the concept of an integrated platform.

  4. SUSTAINABLE REMEDIATION SOFTWARE TOOL EXERCISE AND EVALUATION

    SciTech Connect

    Kohn, J.; Nichols, R.; Looney, B.

    2011-05-12

    The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

  5. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  6. A software tool for dataflow graph scheduling

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1994-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
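
    As a rough sketch of the scheduling problem being solved (a generic list-scheduling heuristic on an invented task graph, not the tool's own graph-theoretic method):

    ```python
    from collections import defaultdict

    # Generic list scheduling of a dataflow graph: a task becomes ready
    # when its data predecessors finish; each task is placed on the
    # processor that can start it earliest. The task graph is invented.
    tasks = {"A": 2, "B": 3, "C": 2, "D": 4}                  # task -> duration
    edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]  # data precedence
    num_procs = 2

    preds = defaultdict(set)
    for u, v in edges:
        preds[v].add(u)

    finish, done = {}, set()
    proc_free = [0.0] * num_procs     # next free time per processor
    while len(done) < len(tasks):
        ready = [t for t in tasks if t not in done and preds[t] <= done]
        # pick the ready task whose inputs are available earliest
        t = min(ready, key=lambda t: max((finish[p] for p in preds[t]), default=0.0))
        p = min(range(num_procs), key=lambda i: proc_free[i])
        start = max(proc_free[p], max((finish[q] for q in preds[t]), default=0.0))
        finish[t] = start + tasks[t]
        proc_free[p] = finish[t]
        done.add(t)
        print(f"task {t}: processor {p}, start {start}, finish {finish[t]}")
    ```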

  7. Westinghouse waste simulation and optimization software tool

    SciTech Connect

    Mennicken, Kim; Aign, Jorg

    2013-07-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence based decisions and enables users to try out lots of 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment tasks, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool will enable the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that this tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis of automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to an effective planning and minimizes the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner, taking

  8. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.
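
    A hedged sketch of the extract-then-classify workflow such a toolbox supports (the features, signals, and nearest-centroid rule below are invented for illustration and are not the FET's actual algorithms):

    ```python
    import numpy as np

    # Feature extraction + nearest-centroid classification sketch.
    def features(signal: np.ndarray) -> np.ndarray:
        # three simple signal features: mean, spread, roughness
        return np.array([signal.mean(), signal.std(),
                         np.abs(np.diff(signal)).mean()])

    rng = np.random.default_rng(0)
    train = {"quiet": [rng.normal(0, 0.1, 256) for _ in range(10)],
             "noisy": [rng.normal(0, 1.0, 256) for _ in range(10)]}
    centroids = {cls: np.mean([features(s) for s in sigs], axis=0)
                 for cls, sigs in train.items()}

    test = rng.normal(0, 0.9, 256)    # unseen signal
    label = min(centroids,
                key=lambda c: np.linalg.norm(features(test) - centroids[c]))
    print("classified as:", label)
    ```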

  9. Software Tools for Acoustic Database Management

    DTIC Science & Technology

    1992-01-01

    Key system requirements must be satisfied. This report discusses these requirements, and describes the software tools developed by the WHOI... Source code listings are supplied. Contents: 1 Introduction; 2 System Overview; 3 Digital File Format; 4 Analog-to-Digital Conversion; 4.1 CSTRM: A Batch Digitizing System; 5 Text Header Maintenance; 5.1 HEADEDIT

  10. Additive manufacturing of tools for lapping glass

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files which the VFlash used in building the parts, layer by layer, from a UV curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center. This indicated that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit corundum aluminum oxide lapping paste, until a near specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimize the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometry and groove patterns for improved loose abrasive finishing.

  11. SAS: a yield/failure analysis software tool

    NASA Astrophysics Data System (ADS)

    de Jong Perez, Susana

    1996-09-01

    As the device sizes decrease and the number of interconnect levels and wafer size increase, device yield and failure analysis becomes more complex. Currently, software tools are being used to perform visual inspection techniques after many operations during which defects are detected on a sample of wafers. However, it has been observed that the correlation between the yield predicted on the basis of the defects found during such observations and the yield determined electrically at wafer final test is low. Of greater interest to yield/failure analysis software tools is statistical analysis software. SAS(TM) can perform extensive data analysis on the electrical parameters of kerf test structures. In addition, the software can merge parametric and yield/fail bin data, which reduces the data collection and data reduction activities involved in correlating device parameters to circuit functional operation. The data are saved in large databases which allow storage and later retrieval of historical data in order to evaluate process shifts and changes and their effect on yield. The merge of process parameters and on-line measurements with final electrical data is also possible with the aid of process parameter extraction software. All of this data analysis provides excellent feedback about integrated circuit wafer processing.

  12. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, called plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
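
    For reference, the exact Bayes factor mentioned here is the standard ratio of marginal likelihoods (written below in generic notation, not the paper's own):

    ```latex
    % Bayes factor comparing models M_1 and M_2 on data D:
    BF_{12} = \frac{p(D \mid M_1)}{p(D \mid M_2)}
            = \frac{\int p(D \mid \theta_1, M_1)\, p(\theta_1 \mid M_1)\, d\theta_1}
                   {\int p(D \mid \theta_2, M_2)\, p(\theta_2 \mid M_2)\, d\theta_2}
    ```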

  13. Tool support for software lookup table optimization

    PubMed Central

    Strout, Michelle Mills; Bieman, James M.

    2012-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963
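
    The core LUT idea is easy to see in a few lines. The sketch below approximates exp() by nearest-entry lookup over a uniform grid and measures the worst relative error; this is only a conceptual illustration in Python, whereas Mesa itself performs source-to-source transformation of C or C++ code.

    ```python
    import math

    # Lookup-table (LUT) sketch: precompute a costly function on a grid,
    # then answer later calls by nearest-entry lookup. Domain and table
    # size are tunable assumptions.
    LO, HI, N = 0.0, 10.0, 4096
    STEP = (HI - LO) / (N - 1)
    TABLE = [math.exp(LO + i * STEP) for i in range(N)]

    def exp_lut(x: float) -> float:
        """Approximate exp(x) for x in [LO, HI] via nearest table entry."""
        i = int((x - LO) / STEP + 0.5)
        return TABLE[min(max(i, 0), N - 1)]

    # Error analysis: worst relative error over a fine sample of the domain
    worst = max(abs(exp_lut(x) - math.exp(x)) / math.exp(x)
                for x in (LO + k * 0.001 for k in range(10000)))
    print(f"max relative error ~ {worst:.2e}")
    ```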

  14. Tool Support for Software Lookup Table Optimization

    DOE PAGES

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.

  15. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    SciTech Connect

    Not Available

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  16. Integrated software tool automates MOV diagnosis

    SciTech Connect

    Joshi, B.D.; Upadhyaya, B.R.

    1996-04-01

    This article reports that researchers at the University of Tennessee have developed digital signal processing software that takes the guesswork out of motor current signature analysis (MCSA). The federal testing regulations for motor-operated valves (MOV) used in nuclear power plants have recently come under critical scrutiny by the Nuclear Regulatory Commission (NRC) and the American Society of Mechanical Engineers (ASME). New ASME testing specifications mandate that all valves performing a safety function are to be tested -- not just ASME Code 1, 2 and 3 valves. The NRC will likely endorse the ASME regulations in the near future. Because of these changes, several utility companies have voluntarily expanded the scope of their in-service testing programs for MOVs, in spite of the additional expense.

  17. Software Metrics Useful Tools or Wasted Measurements

    DTIC Science & Technology

    1990-05-01

    shared by the developers of the field of software metrics. Capers Jones, Chairman of Software Productivity Research, Inc. and a noted pioneer in...development efforts in terms of function points. That will give you a basis for measuring productivity. Capers Jones, chairman of Software... Capers Jones, "Building a better metric," Computerworld Extra, 22 (June 20, 1988): 39. Allen J. Albrecht and John E. Gaffney, Jr., "Software Function

  18. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    ERIC Educational Resources Information Center

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  19. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
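
    The n-factor combinatorial idea can be sketched briefly: draw Monte Carlo cases and keep those that cover a not-yet-seen pair of parameter values, until all 2-factor combinations are exercised. The parameters and levels below are invented, and this standalone sketch is not the Trick-based tool itself.

    ```python
    import itertools, random

    # Greedy pairwise (2-factor) coverage via Monte Carlo draws.
    params = {
        "mass":   [0.8, 1.0, 1.2],          # hypothetical parameter levels
        "thrust": ["low", "nom", "high"],
        "sensor": ["ok", "degraded"],
    }
    names = list(params)
    uncovered = {(a, b, va, vb)
                 for a, b in itertools.combinations(names, 2)
                 for va in params[a] for vb in params[b]}

    random.seed(0)
    cases = []
    while uncovered:
        case = {n: random.choice(params[n]) for n in names}   # Monte Carlo draw
        newly = {(a, b, case[a], case[b])
                 for a, b in itertools.combinations(names, 2)}
        if newly & uncovered:          # keep cases that add pair coverage
            uncovered -= newly
            cases.append(case)

    print(len(cases), "cases cover all parameter-value pairs")
    ```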

  20. A Tool for Managing Software Architecture Knowledge

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    This paper describes a tool for managing architectural knowledge and rationale. The tool has been developed to support a framework for capturing and using architectural knowledge to improve the architecture process. This paper describes the main architectural components and features of the tool. The paper also provides examples of using the tool for supporting well-known architecture design and analysis methods.

  1. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  2. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  3. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  4. Tools Ensure Reliability of Critical Software

    NASA Technical Reports Server (NTRS)

    2012-01-01

    In November 2006, after attempting to make a routine maneuver, NASA's Mars Global Surveyor (MGS) reported unexpected errors. The onboard software switched to backup resources, and a 2-day lapse in communication took place between the spacecraft and Earth. When a signal was finally received, it indicated that MGS had entered safe mode, a state of restricted activity in which the computer awaits instructions from Earth. After more than 9 years of successful operation gathering data and snapping pictures of Mars to characterize the planet's land and weather, communication between MGS and Earth suddenly stopped. Months later, a report from NASA's internal review board found the spacecraft's battery failed due to an unfortunate sequence of events. Updates to the spacecraft's software, which had taken place months earlier, were written to the wrong memory address in the spacecraft's computer. In short, the mission ended because of a software defect. Over the last decade, spacecraft have become increasingly reliant on software to carry out mission operations. In fact, the next mission to Mars, the Mars Science Laboratory, will rely on more software than all earlier missions to Mars combined. According to Gerard Holzmann, manager at the Laboratory for Reliable Software (LaRS) at NASA's Jet Propulsion Laboratory (JPL), even the fault protection systems on a spacecraft are mostly software-based. For reasons like these, well-functioning software is critical for NASA. In the same year as the failure of MGS, Holzmann presented a new approach to critical software development to help reduce risk and provide consistency. He proposed The Power of 10: Rules for Developing Safety-Critical Code, which is a small set of rules that can easily be remembered, clearly relate to risk, and allow compliance to be verified. The reaction at JPL was positive, and developers in the private sector embraced Holzmann's ideas.

  5. EISA 432 Energy Audits Best Practices: Software Tools

    SciTech Connect

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  6. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  7. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  8. NASA Approach to HPCCP Support Software and Tools

    NASA Technical Reports Server (NTRS)

    Blaylock, Bruce; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    The NASA HPCC Program, together with other agencies participating in the Federal HPCC Program, intends to advance technologies to enable the execution of grand challenge applications at sustained rates up to TeraFLOPS. During 1995-6 NASA undertook two major systems software efforts to improve the state of high performance support software and tools. The first of these activities was a replanning of support software and tools activities internal to the Agency. In replanning the software activities, emphasis was placed on meeting the needs of Grand Challenge users, a few projects, and near-term useful results. The revised NASA plan calls for support software and tools activities in four areas: Application Creation Process Support; Application Usage/Operations Support; Advanced Support Software and Tools Concepts; and Metrics-Based Monitoring and Management. The second major activity undertaken was participation in a multiagency Task Force resulting from the Second Pasadena Workshop on System Software and Tools. The task force developed the Guidelines for Writing System Software and Tools Requirements for Parallel and Clustered Computers.

  9. Innovative Software Tools Measure Behavioral Alertness

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  10. An Evaluation Format for "Open" Software Tools.

    ERIC Educational Resources Information Center

    Murphy, Cheryl A.

    1995-01-01

    Evaluates six "open" (empty of content and customized by users) software programs using the literature-based characteristics of documentation, learner control, branching capabilities, portability, ease of use, and cost-effectiveness. Interviewed computer-knowledgeable individuals to confirm the legitimacy of the evaluative characteristics. (LRW)

  11. The Toxicity Estimation Software Tool (T.E.S.T.)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to estimate toxicological values for aquatic and mammalian species considering acute and chronic endpoints for screening purposes within TSCA and REACH programs.

  12. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  13. Developing a Decision Support System: The Software and Hardware Tools.

    ERIC Educational Resources Information Center

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  14. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  15. NASA-Enhanced Version Of Automatically Programmed Tool Software (APT)

    NASA Technical Reports Server (NTRS)

    Purves, L. R.

    1989-01-01

    APT code is one of the most widely used software tools for complex numerically-controlled machining. It is both a programming language and the software that processes that language. Upgrades include a super pocket capability for concave polygon pockets and an editor to reprocess cutter location coordinates according to user-supplied commands.

  16. Software tool for xenon gamma-ray spectrometer control

    NASA Astrophysics Data System (ADS)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    The software tool "Acquisition and processing of gamma-ray spectra" was developed for the control of xenon gamma-ray spectrometers. It supports a multi-window interface. The software tool provides acquisition of gamma-ray spectra from a xenon gamma-ray detector via USB or RS-485 interfaces, directly or via the TCP/IP protocol; energy calibration of gamma-ray spectra; and saving of gamma-ray spectra to disk.

  17. iPhone examination with modern forensic software tools

    NASA Astrophysics Data System (ADS)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are at examining non-jailbroken and jailbroken iPhones.

  18. A Software-System Visualization Tool

    DTIC Science & Technology

    1990-04-01

    databases, servers) and the dependencies established between these modules (e.g., procedure calls, message-passing channels). We use the tool to write...A MINION user can view these implicitly constructed connections and notice incorrect links to library modules and "regions" of a graph. We can scale ...configuration programming systems. MINION handles this problem by hiding unimportant details or scaling them graphically. We can graphically select

  19. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. During the last years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.
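
    The central model such courses dissect is the collinearity relation between object and image coordinates. One common textbook form (generic notation; not tied to PhoX's own interface) is:

    ```latex
    % Collinearity equations: object point (X, Y, Z), projection centre
    % (X_0, Y_0, Z_0), rotation matrix R = (r_{ij}), principal distance c,
    % principal point (x'_0, y'_0):
    x' = x'_0 - c\,\frac{r_{11}(X - X_0) + r_{21}(Y - Y_0) + r_{31}(Z - Z_0)}
                        {r_{13}(X - X_0) + r_{23}(Y - Y_0) + r_{33}(Z - Z_0)}
    \qquad
    y' = y'_0 - c\,\frac{r_{12}(X - X_0) + r_{22}(Y - Y_0) + r_{32}(Z - Z_0)}
                        {r_{13}(X - X_0) + r_{23}(Y - Y_0) + r_{33}(Z - Z_0)}
    ```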

  20. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  1. Concepts and tools for the software life cycle

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach for the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  2. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  3. Software for Use with Optoelectronic Measuring Tool

    NASA Technical Reports Server (NTRS)

    Ballard, Kim C.

    2004-01-01

    A computer program has been written to facilitate and accelerate the process of measurement by use of the apparatus described in "Optoelectronic Tool Adds Scale Marks to Photographic Images" (KSC-12201). The tool contains four laser diodes that generate parallel beams of light spaced apart at a known distance. The beams of light are used to project bright spots that serve as scale marks that become incorporated into photographic images (including film and electronic images). The sizes of objects depicted in the images can readily be measured by reference to the scale marks. The computer program is applicable to a scene that contains the laser spots and that has been imaged in a square pixel format that can be imported into a graphical user interface (GUI) generated by the program. It is assumed that the laser spots and the distance(s) to be measured all lie in the same plane and that the plane is perpendicular to the line of sight of the camera used to record the image.
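
    A minimal sketch of the measurement principle (not the actual program): two laser spots at a known spacing fix the image scale, and object sizes follow from pixel distances. All coordinates and the 100 mm spot spacing are assumed values.

        import numpy as np

        # Hypothetical pixel coordinates of two laser spots and of an object's
        # endpoints; the parallel beams are assumed to be 100 mm apart.
        spot_a, spot_b = np.array([412.0, 305.0]), np.array([812.0, 308.0])
        obj_p, obj_q   = np.array([450.0, 520.0]), np.array([700.0, 540.0])

        SPOT_SPACING_MM = 100.0
        mm_per_px = SPOT_SPACING_MM / np.linalg.norm(spot_b - spot_a)

        # Valid only if the object lies in the same plane as the laser spots,
        # perpendicular to the camera line of sight (as the abstract assumes).
        length_mm = mm_per_px * np.linalg.norm(obj_q - obj_p)
        print(f"{mm_per_px:.3f} mm/px, object length = {length_mm:.1f} mm")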

  4. Forecasting trends in NASA flight software development tools

    NASA Technical Reports Server (NTRS)

    Garman, J. R.

    1983-01-01

    The experience gained in the design and development of Shuttle flight and ground support embedded software systems along with projections of increasing role and size of software in the proposed Space Station and other future NASA projects provides the basis for forecasting substantial changes in the tools and methodologies by which embedded software systems are developed and acquired. Similar changes in software architectures and operator interfaces will lead to substantial changes in the approach and techniques involved in software test and system integration. Increasing commonality among different flight systems and between flight and supporting ground systems is projected, along with a more distributed approach to software acquisition in highly complex projects such as Space Station.

  5. HANSIS software tool for the automated analysis of HOLZ lines.

    PubMed

    Holec, D; Sridhara Rao, D V; Humphreys, C J

    2009-06-01

    A software tool named HANSIS (HOLZ analysis) has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured, and the data can be presented graphically through a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.
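
    As an illustration of the kind of geometry such a tool automates, the sketch below intersects two digitized lines and measures the angle between them. The point coordinates are invented, and this is not the tool's actual code.

        import numpy as np

        def intersection(p1, p2, p3, p4):
            # Intersection of line p1-p2 with line p3-p4 (homogeneous coords)
            l1 = np.cross([*p1, 1.0], [*p2, 1.0])
            l2 = np.cross([*p3, 1.0], [*p4, 1.0])
            x = np.cross(l1, l2)
            return x[:2] / x[2]

        def angle_deg(p1, p2, p3, p4):
            # Acute angle between the two lines, in degrees
            d1, d2 = np.subtract(p2, p1), np.subtract(p4, p3)
            cos = abs(d1 @ d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
            return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

        # Hypothetical digitized points on two HOLZ lines (pixel coordinates)
        a, b = (10.0, 12.0), (200.0, 45.0)
        c, d = (30.0, 180.0), (170.0, 20.0)
        print(intersection(a, b, c, d), angle_deg(a, b, c, d))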

  6. Talkoot: software tool to create collaboratories for earth science

    SciTech Connect

    Movva, Sunil; Ramachandran, Rahul; Maskey, Manil; Kulkarni, Ajinkya; Conover, Helen; Nair, U.S.

    2012-01-01

    Open science, where researchers share and publish every element of their research process in addition to the final results, can foster novel ways of collaboration among researchers and has the potential to spontaneously create new virtual research collaborations. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. Advances in technology allow for software tools that can be used by different research groups and institutions to build and support virtual collaborations and infuse open science. This paper describes Talkoot, a software toolkit designed and developed by the authors to provide Earth Science researchers a ready-to-use knowledge management environment and an online platform for collaboration. Talkoot allows Earth Science researchers a means to systematically gather, tag and share their data, analysis workflows and research notes. These Talkoot features are designed to foster rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium sized groups and research centers, as well as large enterprises such as national laboratories and federal agencies.

  7. Mission-Clock-Display Software Tool

    NASA Technical Reports Server (NTRS)

    Aguilera, Christine; Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria P.

    1993-01-01

    Displays including images of alarm clocks illustrate temporal statuses of multiple events. MCLK is customizable clock-display computer program with Motif user interface. Used to keep track of such multiple "milestone" events as those occurring during countdowns in spacecraft launches, and alerts user when event time reached. In addition, program displays time from several time zones. Real time measured in Coordinated Universal Time. Written in C language.
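
    A small sketch of the milestone-clock behaviour described, assuming a hypothetical milestone list: it counts down to each event in UTC and shows the same instant in other time zones.

        from datetime import datetime, timezone
        from zoneinfo import ZoneInfo

        # Hypothetical milestone list in UTC, mimicking the countdown behaviour
        milestones = {"T-0 launch": datetime(2024, 3, 1, 14, 0, tzinfo=timezone.utc)}

        now = datetime.now(timezone.utc)
        for name, t in milestones.items():
            remaining = t - now
            status = "REACHED" if remaining.total_seconds() <= 0 else f"{remaining} to go"
            print(f"{name}: {status}")
            # Display the same instant in other zones, as the tool does
            for zone in ("America/Los_Angeles", "Europe/Paris"):
                print(f"  {zone}: {t.astimezone(ZoneInfo(zone)):%Y-%m-%d %H:%M %Z}")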

  8. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  9. Exoskeletons, Robots and System Software: Tools for the Warfighter

    DTIC Science & Technology

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am–12:00 pm. "The views... Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will... global hub for educating, informing, and connecting Information Age leaders." What is an exoskeleton? An exoskeleton is a wearable robot suit that

  10. Concepts and Tools for the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach for the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  11. Management of Astronomical Software Projects with Open Source Tools

    NASA Astrophysics Data System (ADS)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: a build and automated-testing system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating systems; version control and change management; an enhanced wiki and issue-tracking system for online documentation and reporting; and groupware tools such as a blog, discussions and a calendar. A new project and configuration management tool for developing astronomical software was first sought for the Linc-Nirvana instrument. After evaluating various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA now use it as well.

  12. Software Tools in Endoscopy - Nice to Have or Essential?

    PubMed Central

    Möschler, Oliver

    2016-01-01

    Background Documentation of findings and of the treatment implications resulting from them is one of the central tasks involved in medical work. The introduction of software tools for managing and providing technical support for this task is a logical development. Methods A literature search was conducted in September 2015 using PubMed and the search terms ‘gastrointestinal endoscopy AND electronic documentation’ and ‘software tools AND gastrointestinal endoscopy AND documentation’. Results The requirements in relation to documentation, patient information and sedation, dealing with histological findings, materials logistics, recording video documents, and hygiene documentation are discussed. Conclusion Software tools are essential for managing basic documentation requirements. However, for many aspects of the documentation required in a modern endoscopy department, there are various - and sometimes substantial - gaps in the programs currently available. More intensive discussions need to take place regarding existing gaps and requirements, both with the suppliers concerned and among colleagues and specialist societies. PMID:27588294

  13. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  14. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  15. Case Studies of Software Development Tools for Parallel Architectures

    DTIC Science & Technology

    1993-06-01

    RL-TR-93-114, Final Technical Report (AD-A269 193): Case Studies of Software Development Tools for Parallel Architectures. Only title-page and table-of-contents fragments are recoverable; the report surveys tools including Omega/PegaSys, PARET, Pisces, BALSA II, TANGO, VMMP, PSG, POKER, ISSOS, and Unity.

  16. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
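
    To make the design concrete, here is a toy instrument definition and a reader for it. The tag names and structure are invented for illustration and are not the project's actual XML schema.

        import xml.etree.ElementTree as ET

        # Hypothetical instrument definition of the kind an interpreting
        # application (Java or Objective-C in the paper) could present.
        doc = """
        <instrument name="mobility-demo">
          <item id="q1" video="walk_level.mp4">
            <prompt>How difficult is walking on level ground?</prompt>
            <response value="1">No difficulty</response>
            <response value="2">Some difficulty</response>
          </item>
        </instrument>
        """

        root = ET.fromstring(doc)
        for item in root.iter("item"):
            print(item.get("id"), "->", item.find("prompt").text)
            for r in item.findall("response"):
                print("  ", r.get("value"), r.text)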

  17. Proposing a Mathematical Software Tool in Physics Secondary Education

    ERIC Educational Resources Information Center

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to the educational process. The use of natural mathematical notation…

  18. Role of Social Software Tools in Education: A Literature Review

    ERIC Educational Resources Information Center

    Minocha, Shailey

    2009-01-01

    Purpose: The purpose of this paper is to provide a review of literature on the role of Web 2.0 or social software tools in education. Design/methodology/approach: This paper is a critical and comprehensive review of a range of literature sources (until January 2009) addressing the various issues related to the educator's perspective of pedagogical…

  19. Simple tools and software for precision weed mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Simple Tools and Software for Precision Weed Mapping. L. Wiles. If you have a color digital camera and a handheld GPS unit, you can map weed problems in your fields. German researchers are perfecting technology to map weed species and density with digital cameras for precision herbicide application. ...

  20. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…
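
    One textbook way such simulators obtain an impulse response, shown as a hedged sketch: sample the frequency response on a uniform grid and apply an inverse FFT. The first-order low-pass below is an assumed stand-in for a real microwave network, and the grid parameters are illustrative.

        import numpy as np

        n, fs = 1024, 20e9                       # samples and sampling rate (Hz)
        f = np.fft.rfftfreq(n, d=1.0 / fs)       # uniform frequency grid
        fc = 1e9                                 # assumed corner frequency
        H = 1.0 / (1.0 + 1j * f / fc)            # first-order low-pass response

        h = np.fft.irfft(H, n=n)                 # impulse response samples
        t = np.arange(n) / fs
        print(t[:3], h[:3])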

  1. Chips: A Tool for Developing Software Interfaces Interactively.

    ERIC Educational Resources Information Center

    Cunningham, Robert E.; And Others

    This report provides a detailed description of Chips, an interactive tool for developing software employing graphical/computer interfaces on Xerox Lisp machines. It is noted that Chips, which is implemented as a collection of customizable classes, provides the programmer with a rich graphical interface for the creation of rich graphical…

  2. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  3. GenePRIMP: A software quality control tool

    SciTech Connect

    Amrita Pati

    2010-05-05

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  4. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), a region of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. This software tool may therefore be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
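
    A minimal sketch of the underlying idea, not Knickpoint Finder itself: walk a longitudinal stream profile extracted from a DEM and flag points whose local slope is anomalously steep relative to the profile's typical gradient. The profile and the factor of 3 are synthetic, assumed values.

        import numpy as np

        dist = np.linspace(0, 10_000, 101)        # m along the channel
        elev = 500 - 0.01 * dist                  # gentle background gradient
        elev[60:] -= 15                           # synthetic step = a knickzone

        slope = np.gradient(elev, dist)           # local channel slope
        steep = np.abs(slope) > 3 * np.abs(np.median(slope))
        print("candidate knickpoints near", dist[steep], "m")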

  5. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2016-07-12

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  6. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  7. Software tools for visualizing Hi-C data.

    PubMed

    Yardımcı, Galip Gürkan; Noble, William Stafford

    2017-02-03

    High-throughput assays for measuring the three-dimensional (3D) configuration of DNA have provided unprecedented insights into the relationship between DNA 3D configuration and function. Data interpretation from assays such as ChIA-PET and Hi-C is challenging because the data is large and cannot be easily rendered using standard genome browsers. An effective Hi-C visualization tool must provide several visualization modes and be capable of viewing the data in conjunction with existing, complementary data. We review five software tools that do not require programming expertise. We summarize their complementary functionalities, and highlight which tool is best equipped for specific tasks.

  8. An Analysis of Adenovirus Genomes Using Whole Genome Software Tools

    PubMed Central

    Mahadevan, Padmanabhan

    2016-01-01

    The evolution of sequencing technology has led to an enormous increase in the number of genomes that have been sequenced. This is especially true in the field of virus genomics. In order to extract meaningful biological information from these genomes, whole genome data mining software tools must be utilized. Hundreds of tools have been developed to analyze biological sequence data. However, only some of these tools are user-friendly to biologists. Several of these tools that have been successfully used to analyze adenovirus genomes are described here. These include Artemis, EMBOSS, pDRAW, zPicture, CoreGenes, GeneOrder, and PipMaker. These tools provide functionalities such as visualization, restriction enzyme analysis, alignment, and proteome comparisons that are extremely useful in the bioinformatics analysis of adenovirus genomes. PMID:28293072

  9. A Software Communication Tool for the Tele-ICU

    PubMed Central

    Pimintel, Denise M.; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high volume, high acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool designed to improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider. PMID:24551398

  10. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
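
    For context, here are the Basic COCOMO equations, one of the model families the abstract names, in a few lines. The "organic-mode" coefficients are Boehm's published values; the 32 KLOC input is hypothetical.

        # Basic COCOMO, organic mode: effort and schedule from size in KLOC
        def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
            effort = a * kloc ** b          # person-months
            schedule = c * effort ** d      # calendar months
            return effort, schedule

        effort, months = cocomo_basic(32.0)   # hypothetical 32 KLOC product
        print(f"{effort:.1f} person-months over {months:.1f} months, "
              f"avg staff {effort / months:.1f}")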

  11. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  12. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
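
    To show what "executing" a Petri net means, here is a toy marking-and-firing loop: places hold token counts, and a transition fires when all its input places are marked. The two-transition net is invented for illustration and is far simpler than anything DFPN would generate.

        # Toy Petri-net executor: a two-step data flow (validate, then store)
        net = {
            "validate": {"in": ["raw_input"], "out": ["validated"]},
            "store":    {"in": ["validated"], "out": ["archived"]},
        }
        marking = {"raw_input": 1, "validated": 0, "archived": 0}

        def enabled(t):
            return all(marking[p] > 0 for p in net[t]["in"])

        def fire(t):
            for p in net[t]["in"]:
                marking[p] -= 1
            for p in net[t]["out"]:
                marking[p] = marking.get(p, 0) + 1

        while any(enabled(t) for t in net):
            t = next(t for t in net if enabled(t))
            fire(t)
            print(f"fired {t}: {marking}")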

  13. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one ]to ]one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user ]defined applications, diagnostic systems, search techniques, and system requirements/constraints. One or more sensor suites that maximize this performance while meeting other user ]defined system requirements that are presumed to exist. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  14. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.

    1990-01-01

    A prototype is described that can serve as a scientific-modeling software tool to facilitate the development of useful scientific models. The prototype was developed for applications in planetary modeling, and specific examples are given that relate to the atmosphere of Titan. The scientific modeling tool employs a high-level domain-specific modeling language, several data-display facilities, and a library of experimental datasets and scientific equations. The planetary modeling prototype links uncomputed physical variables to computed variables with computational transformations based on a backchaining procedure. The system - implemented in LISP with an object-oriented knowledge-representation tool - runs on a workstation and provides an interface to several models. The prototype is expected to form the basis for a sophisticated modeling tool that permits active experimentation.
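
    A stripped-down illustration of the backchaining idea, with an invented Titan-like example: requesting an uncomputed variable triggers computation of its inputs first. The rule table, constants and values are assumptions for illustration only.

        # Minimal back-chaining evaluator: each rule names the inputs it needs
        # and how to compute its output from them.
        rules = {
            "scale_height": (["temperature", "mean_mol_mass", "gravity"],
                             lambda T, m, g: 8.314 * T / (m * g)),  # H = RT/(Mg)
        }
        known = {"temperature": 94.0,        # K, near Titan's surface
                 "mean_mol_mass": 0.028,     # kg/mol (mostly N2)
                 "gravity": 1.352}           # m/s^2

        def solve(var):
            if var in known:
                return known[var]
            inputs, fn = rules[var]
            known[var] = fn(*(solve(v) for v in inputs))   # chain backwards
            return known[var]

        print(f"scale height ~ {solve('scale_height') / 1000:.1f} km")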

  15. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  16. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  17. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  18. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  19. Evaluation of free non-diagnostic DICOM software tools

    NASA Astrophysics Data System (ADS)

    Liao, Wei; Deserno, Thomas M.; Spitzer, Klaus

    2008-03-01

    A variety of software exists to interpret files or directories compliant to the Digital Imaging and Communications in Medicine (DICOM) standard and display them as individual images or volume rendered objects. Some of them offer further processing and analysis features. The surveys that have been published so far are partly not up-to-date anymore, and neither a detailed description of the software functions nor a comprehensive comparison is given. This paper aims at evaluation and comparison of freely available, non-diagnostic DICOM software with respect to the following aspects: (i) data import; (ii) data export; (iii) header viewing; (iv) 2D image viewing; (v) 3D volume viewing; (vi) support; (vii) portability; (viii) workability; and (ix) usability. In total, 21 tools were included: 3D Slicer, AMIDE, BioImage Suite, DicomWorks, EViewBox, ezDICOM, FPImage, ImageJ, JiveX, Julius, MedImaView, MedINRIA, MicroView, MIPAV, MRIcron, Osiris, PMSDView, Syngo FastView, TomoVision, UniViewer, and XMedCon. Our results in table form can ease the selection of appropriate DICOM software tools. In particular, we discuss use cases for the inexperienced user, data conversion, and volume rendering, and suggest Syngo FastView or PMSDView, DicomWorks or XMedCon, and ImageJ or UniViewer, respectively.

  20. Westinghouse Waste Simulation and Optimization Software Tool - 13493

    SciTech Connect

    Mennicken, Kim; Aign, Joerg

    2013-07-01

    Radioactive waste is produced during NPP operation and NPP D and D. Different kinds of waste with different volumes and properties have to be treated. Finding a technically and commercially optimized waste treatment concept is a difficult and time consuming process. The Westinghouse waste simulation and optimization software tool is an approach to study the total life cycle cost of any waste management facility. The tool enables the user of the simulation and optimization software to plan processes and storage buildings and to identify bottlenecks in the overall waste management design before starting detailed planning activities. Furthermore, application of the software enables the user to optimize the number of treatment systems, to determine the minimum design capacity for onsite storage facilities, to identify bottlenecks in the overall design and to identify the most cost-effective treatment paths by maintaining optimal waste treatment technologies. In combination with proven waste treatment equipment and integrated waste management solutions, the waste simulation and optimization software provides reliable qualitative results that lead to an effective planning and minimization of the total project planning risk of any waste management activity. (authors)

  1. Software Certification for Temporal Properties With Affordable Tool Qualification

    NASA Technical Reports Server (NTRS)

    Xia, Songtao; DiVito, Benedetto L.

    2005-01-01

    It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.

  2. fMRI analysis software tools: an evaluation framework

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo

    2011-03-01

    Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data in a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as belonging to the brain or not, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Through the definitions of intersection and union from fuzzy logic, an index was defined that quantifies the information overlap between Statistical Parametric Maps (SPMs). Direct comparison between fMRI results can only highlight differences; in order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, the Activation Weighted Index (AWI), which identifies the mean activation level of the whole area, was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The results of the AWI analysis suggest which tool has higher sensitivity and specificity. The proposed method seems a valid evaluation tool when applied to an adequate number of patients.
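
    A sketch of an overlap index of the kind described, using the fuzzy minimum and maximum for intersection and union; the two maps below are random stand-ins for normalized SPMs, not real data.

        import numpy as np

        rng = np.random.default_rng(0)
        spm_a = rng.random((16, 16, 8))          # stand-ins for two tools' SPMs
        spm_b = np.clip(spm_a + 0.1 * rng.standard_normal(spm_a.shape), 0, 1)

        # Fuzzy intersection = voxelwise min, union = voxelwise max;
        # the ratio of their sums quantifies agreement (1.0 = identical maps)
        overlap = np.minimum(spm_a, spm_b).sum() / np.maximum(spm_a, spm_b).sum()
        print(f"fuzzy overlap index = {overlap:.3f}")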

  3. Classroom Live: a software-assisted gamification tool

    NASA Astrophysics Data System (ADS)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  4. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the past strength of the Earth's magnetic field from volcanic rocks or archeological materials. By reducing the number of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
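
    A minimal sketch of the regression step common to MSP analysis: fit the normalized remanence ratio against the applied lab field, read the paleointensity from the x-intercept, and test the y-intercept against its theoretical value. The data are synthetic, and the -1 criterion and 0.1 tolerance are assumed, illustrative choices.

        import numpy as np

        h_lab = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # microtesla
        ratio = (h_lab / 30.0 - 1.0
                 + 0.02 * np.random.default_rng(1).standard_normal(5))

        slope, intercept = np.polyfit(h_lab, ratio, 1)
        paleointensity = -intercept / slope                # field where ratio = 0
        print(f"H_anc ~ {paleointensity:.1f} uT, y-intercept = {intercept:.2f}")
        print("y-intercept OK" if abs(intercept + 1.0) < 0.1 else "fails criterion")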

  5. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo

  6. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    PubMed

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research such as lack of local organization and standard descriptions.

  7. An Approach to Building a Traceability Tool for Software Development

    NASA Technical Reports Server (NTRS)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional
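
    As a toy illustration of tracing as a mapping between artifact objects: requirements link to the design elements and code modules that realize them, so a change to one artifact can be followed to everything it touches. All names below are invented for illustration.

        # Hypothetical traceability mapping from requirements to artifacts
        trace = {
            "REQ-17 max power usage": {"design": ["power_budget.doc#3.2"],
                                       "code":   ["power_monitor.c"]},
            "REQ-42 conveyor speed":  {"design": ["line_layout.doc#2.1"],
                                       "code":   ["conveyor_ctl.c", "safety_ilock.c"]},
        }

        def impacted_by(module):
            # Which requirements are affected if this code module changes?
            return [req for req, links in trace.items() if module in links["code"]]

        print(impacted_by("safety_ilock.c"))   # -> ['REQ-42 conveyor speed']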

  8. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  9. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that agglutinates the efforts for introducing fuzzy logic into logic programming (LP), in order to incorporate more expressive resources on such languages for dealing with uncertainty and approximated reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.

  10. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  11. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    PubMed

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.

  12. Using Blogging Software to Provide Additional Writing Instruction

    ERIC Educational Resources Information Center

    Carver, Lin B.; Todd, Carol

    2016-01-01

    Classroom teachers sometimes struggle trying to find time during the typical school day to provide the writing instruction students need to be successful. This study examined 29 fifth through twelfth grade classroom teachers' survey responses about their perception of the effectiveness of using an online blogging tool, Kidblog, to plan and provide…

  13. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses for moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of other alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis examines only the displacement, velocity, and acceleration of the mechanism, without considering effects from the masses of components. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. With keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, comparisons between software tools were presented in terms of the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. A Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.

  14. SPIRou @ CFHT: data reduction software and simulation tools

    NASA Astrophysics Data System (ADS)

    Artigau, Étienne; Bouchy, François; Delfosse, Xavier; Bonfils, Xavier; Donati, Jean-François; Figueira, Pedro; Thanjavur, Karun; Lafrenière, David; Doyon, René; Surace, Christian; Moutou, Claire; Boisse, Isabelle; Saddlemyer, Leslie; Loop, David; Kouach, Driss; Pepe, Francesco; Lovis, Christophe; Hernandez, Olivier; Wang, Shiang-Yu

    2012-09-01

    SPIRou is a near-infrared, echelle spectropolarimeter/velocimeter under design for the 3.6m Canada-France-Hawaii Telescope (CFHT) on Mauna Kea, Hawaii. The unique scientific capabilities and technical design features are described in the accompanying papers at this conference. In this paper we focus on the data reduction software (DRS) and the data simulation tool. The SPIRou DRS builds upon the experience of the existing SOPHIE, HARPS and ESPADONS spectrographs, class-leading instruments for high-precision RV measurements and spectropolarimetry. While SPIRou shares many characteristics with these instruments, moving to the near-infrared domain brings specific data-processing challenges: the presence of a large number of telluric absorption lines, strong emission sky lines, thermal background, science arrays with poorer cosmetics, etc. In order for the DRS to be fully functional for SPIRou's first light in 2015, we developed a data simulation tool that incorporates numerous instrumental and observational effects. We present an overview of the DRS and the simulation tool architectures.

  15. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.
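
    The exponential assumption discussed above can be probed with elementary statistics. The sketch below, which uses hypothetical inter-failure times rather than the study's data, fits an exponential rate by maximum likelihood and compares the empirical and fitted survival functions; deviations of the kind the authors report would show up in the extreme tails.

        import numpy as np

        # Hypothetical inter-failure times (the study's data are not reproduced here)
        t = np.array([0.8, 1.3, 0.2, 2.9, 0.6, 4.1, 1.7, 0.9, 3.3, 0.4])

        lam = 1.0 / t.mean()                               # exponential MLE of the rate
        t_sorted = np.sort(t)
        emp_sf = 1.0 - np.arange(1, t.size + 1) / t.size   # empirical survival function
        exp_sf = np.exp(-lam * t_sorted)                   # fitted exponential survival
        for ts, e, f in zip(t_sorted, emp_sf, exp_sf):
            print(f"t={ts:4.1f}  empirical S(t)={e:.2f}  exponential S(t)={f:.2f}")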

  16. User Guide for the STAYSL PNNL Suite of Software Tools

    SciTech Connect

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
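
    For readers unfamiliar with spectral adjustment, the generalized least-squares update has a standard form; the notation below is ours and sketches the general method rather than STAYSL PNNL's exact formulation. With a prior spectrum $\phi_0$ (covariance $C_\phi$), measured reaction rates $a$ (covariance $C_a$), and an activation cross-section response matrix $\Sigma$, the adjusted spectrum and its covariance are

        \phi = \phi_0 + C_\phi \Sigma^{\top} \left( \Sigma C_\phi \Sigma^{\top} + C_a \right)^{-1} \left( a - \Sigma \phi_0 \right)

        C_\phi' = C_\phi - C_\phi \Sigma^{\top} \left( \Sigma C_\phi \Sigma^{\top} + C_a \right)^{-1} \Sigma C_\phi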

  17. Utilizing Spectroscopic Research Tools and Software in the Classroom

    NASA Astrophysics Data System (ADS)

    Grubbs, G. S., II

    2015-06-01

    Given today's technological age, it has become crucial to be able to reach the student in a more "tech-savvy" way than traditional classroom methods afford. Given this, there is already a vast range of software packages available to the molecular spectroscopist that can easily be introduced to the classroom with success. This talk will highlight taking a few of these tools (Gaussian09, SPFIT/SPCAT, the AABS package, LabVIEW, etc.) and implementing them in the classroom to teach subjects such as Quantum Mechanics and Thermodynamics, as well as to aid in the linkage between these subjects. Examples of project implementation with both undergraduate and graduate students will be presented, with a discussion of the successes and failures of such attempts.

  18. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of an algebraic method for XML data processing which accelerates XML parsing. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works of the same authors. The proposed parsing mechanism is easily accessible to a web consumer, who is able to control XML file processing, to search for different elements (tags), and to delete and add new XML content. The various tests presented show higher speed and lower resource consumption in comparison with some existing commercial parsers.
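
    The navigation and restructuring operations listed above (searching for tags, deleting nodes, adding content) are of the kind shown below with Python's standard library; this is purely an illustration of the operations a user performs, not the authors' algebraic implementation.

        import xml.etree.ElementTree as ET

        doc = ET.fromstring(
            "<library><book id='1'><title>A</title></book>"
            "<book id='2'><title>B</title></book></library>")

        # Search for elements (tags) anywhere in the hierarchy
        titles = [t.text for t in doc.iter("title")]

        # Delete a node, then add new XML content
        doc.remove(doc.find("book[@id='2']"))
        new = ET.SubElement(doc, "book", id="3")
        ET.SubElement(new, "title").text = "C"

        print(titles, ET.tostring(doc, encoding="unicode"))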

  19. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    SciTech Connect

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
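
    As an illustration of the gamma-index analysis such modules perform, here is a minimal one-dimensional sketch in Python (our simplification for a single profile; the tool itself is MatLab-based and also covers point doses and depth-dose curves).

        import numpy as np

        def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
            # Brute-force 1D gamma index with a global dose-difference criterion.
            # x_*: positions in mm; d_*: doses; dd: dose-difference criterion as a
            # fraction of the maximum reference dose; dta: distance criterion in mm.
            dd_abs = dd * d_ref.max()
            gammas = np.empty(d_ref.size)
            for i, (x, d) in enumerate(zip(x_ref, d_ref)):
                cap = np.sqrt(((x_eval - x) / dta) ** 2 + ((d_eval - d) / dd_abs) ** 2)
                gammas[i] = cap.min()   # minimum over the evaluated distribution
            return gammas               # pass rate = fraction with gamma <= 1

        # Example: compare a reference profile against a slightly shifted copy
        x = np.linspace(-20.0, 20.0, 401)
        ref = 100.0 * np.exp(-x**2 / 50.0)
        ev = 100.0 * np.exp(-(x - 0.5)**2 / 50.0)
        g = gamma_index_1d(x, ref, x, ev)
        print(f"gamma pass rate: {(g <= 1).mean():.1%}")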

  20. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    DTIC Science & Technology

    1991-04-01

    The surviving front matter indicates that the survey catalogs reliability, maintainability, supportability, and testability (R/M/S/T) software tools by category, including reliability prediction tools, reliability modeling tools (e.g., ARM, Advanced Reliability Modeling), fault tree analysis tools, thermal analysis tools, and structural reliability evaluation tools.

  1. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
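
    The Manning equation on which the GFT is based relates discharge to channel geometry, roughness, and slope; in SI units it reads (the relation itself is standard, although the GFT's implementation details are not reproduced here)

        Q = \frac{1}{n} A R^{2/3} S^{1/2}

    where $Q$ is the discharge, $n$ is Manning's roughness coefficient, $A$ is the flow cross-sectional area, $R$ is the hydraulic radius (area divided by wetted perimeter), and $S$ is the slope of the energy grade line.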

  2. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links (formulas) between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  3. Software Development Outsourcing Decision Support Tool with Neural Network Learning

    DTIC Science & Technology

    2004-03-01

    The surviving abstract fragments compare the elements included in the tool's new model with those in the original model: certain software domains (enterprise scripting, accounting, and order entry) and outsourcing processes (maintenance, training, configuration management, and software engineer support) were found in one model but not the other.

  4. Survivability as a Tool for Evaluating Open Source Software

    DTIC Science & Technology

    2015-06-01

    Open source is often mistaken to mean "free" software. It may be true in some instances that open source software is offered free of charge, but there are other costs, such as upgrades and maintenance. Cost is always a big driver of software selection, and it should be considered even when using OSS acquired free of charge. Many developers contribute to open source projects during their free time, and may build software because the commercial-off-the-shelf (COTS) software already in existence does not offer what they need.

  5. IVUSAngio tool: a publicly available software for fast and accurate 3D reconstruction of coronary arteries.

    PubMed

    Doulaverakis, Charalampos; Tsampoulatidis, Ioannis; Antoniadis, Antonios P; Chatzizisis, Yiannis S; Giannopoulos, Andreas; Kompatsiaris, Ioannis; Giannoglou, George D

    2013-11-01

    There is ongoing research and clinical interest in the development of reliable and easily accessible software for the 3D reconstruction of coronary arteries. In this work, we present the architecture and validation of IVUSAngio Tool, an application which performs fast and accurate 3D reconstruction of the coronary arteries by using intravascular ultrasound (IVUS) and biplane angiography data. The 3D reconstruction is based on the fusion of the detected arterial boundaries in IVUS images with the 3D IVUS catheter path derived from the biplane angiography. The IVUSAngio Tool suite integrates all the intermediate processing and computational steps and provides a user-friendly interface. It also offers additional functionality, such as automatic selection of the end-diastolic IVUS images, semi-automatic and automatic IVUS segmentation, vascular morphometric measurements, graphical visualization of the 3D model and export in a format compatible with other computer-aided design applications. Our software was applied and validated in 31 human coronary arteries, yielding quite promising results. Collectively, the use of IVUSAngio Tool significantly reduces the total processing time for 3D coronary reconstruction. IVUSAngio Tool is distributed as free software, publicly available to download and use.

  6. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    A notice from the Department of Labor, Employment and Training Administration, published in the Federal Register via the Government Publishing Office, concerning workers at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA.

  7. Decision graphs: a tool for developing real-time software

    SciTech Connect

    Kozubal, A.J.

    1981-01-01

    The use of decision graphs in the preparation of real-time software, in particular, is briefly described. The usefulness of decision graphs in software design, testing, and maintenance is pointed out. 2 figures. (RWR)

  8. Teaching Undergraduate Software Engineering Using Open Source Development Tools

    DTIC Science & Technology

    2012-01-01

    The surviving abstract fragments describe a multi-course sequence used to teach students both the theoretical concepts of software development and the practical aspects of developing software with open source development tools.

  9. Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes

    SciTech Connect

    Post, Brian K; Nuttall, David; Cukier, Michael; Hile, Michael

    2016-07-29

    The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore, multiple fixtures (10 to 27) are required per refrigerator model. Molds are particular to a specific product configuration, making mixed-model assembly challenging for sequencing, mold changes, or auto-changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.

  10. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, G.; Ferguson, B. L.

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study has indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease, but not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on these findings, developing a larger knowledge base and a procedure to extract rules to aid in TBC design.

  11. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, Gregory; Ferguson, B. Lynn

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease, but not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on these findings, developing a larger knowledge base and a procedure to extract rules to aid in TBC design.
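
    Both records mention deriving design rules from a factorial design of experiments via a Yates analysis. The sketch below implements the textbook Yates algorithm for a full 2^k factorial in Python, as a generic illustration of the technique rather than the program's own code.

        import numpy as np

        def yates_effects(y):
            # Responses must be in Yates standard order, e.g. for k=2: (1), a, b, ab.
            # Each pass forms pair sums followed by pair differences; after k passes
            # the first entry is the grand total and the rest are effect contrasts.
            y = np.asarray(y, dtype=float)
            k = int(np.log2(y.size))
            assert 2 ** k == y.size, "length must be a power of two"
            col = y.copy()
            for _ in range(k):
                col = np.concatenate([col[0::2] + col[1::2], col[1::2] - col[0::2]])
            mean = col[0] / 2 ** k
            effects = col[1:] / 2 ** (k - 1)   # A, B, AB, C, ... in standard order
            return mean, effects

        mean, (A, B, AB) = yates_effects([3.0, 5.0, 4.0, 8.0])
        print(mean, A, B, AB)   # 5.0 3.0 2.0 1.0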

  12. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of the software in SHMTools is embeddable; it consists of MATLAB functions that can be cross-compiled into generic 'C' programs to be run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software are discussed, along with several example processes that can serve as guidelines for future use of the software.
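
    A minimal sketch of the three pipeline stages named above (simulated acquisition, autoregressive feature extraction, and outlier-based classification), written in Python under our own assumptions; SHMTools itself is MATLAB-based and its actual function names differ.

        import numpy as np

        def ar_features(signal, order=4):
            # Least-squares AR(order) coefficients as damage-sensitive features.
            X = np.column_stack([signal[order - i - 1 : -i - 1] for i in range(order)])
            y = signal[order:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        rng = np.random.default_rng(0)

        # Stages 1-2: "acquire" baseline (undamaged) records and extract features
        baseline = np.array([ar_features(rng.standard_normal(1024)) for _ in range(50)])
        mu, cov = baseline.mean(axis=0), np.cov(baseline.T)

        def damage_indicator(signal):
            # Stage 3: Mahalanobis distance of a record's features from baseline
            f = ar_features(signal)
            return float((f - mu) @ np.linalg.solve(cov, f - mu))

        threshold = np.percentile(
            [damage_indicator(rng.standard_normal(1024)) for _ in range(50)], 95)
        print(f"declare damage when indicator > {threshold:.2f}")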

  13. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    NASA Astrophysics Data System (ADS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-02-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may be identified only late in the design process and thus lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
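
    As a small example of what such toolboxes automate, the Monte Carlo propagation approach of GUM Supplement 1 can be sketched in a few lines of Python; the measurement model and the numeric values below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 200_000

        # Input quantities with standard uncertainties (hypothetical values)
        V = rng.normal(230.0, 0.5, N)   # voltage in volts
        R = rng.normal(52.9, 0.2, N)    # resistance in ohms

        P = V**2 / R                    # measurement model: dissipated power
        p_hat, u_p = P.mean(), P.std(ddof=1)
        lo, hi = np.percentile(P, [2.5, 97.5])   # 95 % coverage interval
        print(f"P = {p_hat:.1f} W, u(P) = {u_p:.2f} W, 95 % interval [{lo:.1f}, {hi:.1f}]")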

  14. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

    The recoverable abstract fragments concern an ontology to support interoperability and enhanced communication among software development tools, noting that efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects.

  15. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth.

  16. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  17. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external analog-to-digital converter (GW Instruments, Somerville, Massachusetts) connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard. First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  18. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  19. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool both for computing multi-risk and for managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for

  20. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    The utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal-computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG), which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  1. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing and critique of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.

  2. Emerging role of bioinformatics tools and software in evolution of clinical research

    PubMed Central

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research. PMID:27453827

  3. Emerging role of bioinformatics tools and software in evolution of clinical research.

    PubMed

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.

  4. Pvarray: A software tool for photovoltaic array design

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

    The application of PVARRAY, a software program for the design of photovoltaic arrays, is described. Results of sample parametric studies on array configurations are presented. It is concluded that PVARRAY can simulate a variety of configurations.

  5. OpenROCS: a software tool to control robotic observatories

    NASA Astrophysics Data System (ADS)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the necessary processes to implement responses to the system events that appear in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provides high flexibility to be adapted to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).
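
    The event-action pattern described above can be illustrated with a small dispatcher; the sketch below is in Python with invented event and handler names (OpenROCS itself is written in PHP and configured through XML files).

        from typing import Callable

        handlers: dict[str, list[Callable[[dict], None]]] = {}

        def on(event: str):
            # Decorator that registers a handler for a named system event.
            def register(fn):
                handlers.setdefault(event, []).append(fn)
                return fn
            return register

        @on("clouds_detected")
        def close_dome(ctx: dict) -> None:
            print(f"closing dome at {ctx['site']}")

        @on("clouds_detected")
        def suspend_queue(ctx: dict) -> None:
            print("suspending observation queue")

        def dispatch(event: str, ctx: dict) -> None:
            # Run every action configured for the event, in registration order.
            for fn in handlers.get(event, []):
                fn(ctx)

        dispatch("clouds_detected", {"site": "Montsec"})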

  6. Klonos: A Similarity Analysis Based Tool for Software Porting

    SciTech Connect

    Hernandez, Oscar; Ding, Wei

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It clusters subroutines using syntactic and cost-model-provided metrics, aggregating similar subroutines that can be ported similarly. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  7. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.

  8. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  9. CUSTOMER RESPONSE TO BESTPRACTICES TRAINING AND SOFTWARE TOOLS PROVIDED BY DOE'S INDUSTRIAL TECHNOLOGIES PROGRAM

    SciTech Connect

    Schweitzer, Martin; Martin, Michaela A; Schmoyer, Richard L

    2008-03-01

    The BestPractices program area, which has evolved into the Save Energy Now (SEN) Initiative, is a component of the U.S. Department of Energy's (DOE's) Industrial Technologies Program (ITP) that provides technical assistance and disseminates information on energy-efficient technologies and practices to U.S. industrial firms. The BestPractices approach to information dissemination includes conducting training sessions which address energy-intensive systems (compressed air, steam, process heat, pumps, motors, and fans) and distributing DOE software tools on those same topics. The current report documents a recent Oak Ridge National Laboratory (ORNL) study undertaken to determine the implementation rate, attribution rate, and reduction factor for industrial end-users who received BestPractices training and registered software in FY 2006. The implementation rate is the proportion of service recipients taking energy-saving actions as a result of the service received. The attribution rate applies to those individuals taking energy-saving actions as a result of the services received and represents the portion of the savings achieved through those actions that is due to the service. The reduction factor is the saving that is realized from program-induced measures as a proportion of the potential savings that could be achieved if all service recipients took action. In addition to examining those factors, the ORNL study collected information on selected characteristics of service recipients, the perceived value of the services provided, and the potential energy savings that can be achieved through implementation of measures identified from the training or software. Because the provision of training is distinctly different from the provision of software tools, the two efforts were examined independently and the findings for each are reported separately.

  10. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  11. Software Validation, Verification, and Testing Technique and Tool Reference Guide. Final Report.

    ERIC Educational Resources Information Center

    Powell, Patricia B., Ed.

    Intended as an aid in the selection of software techniques and tools, this document contains three sections: (1) a suggested methodology for the selection of validation, verification, and testing (VVT) techniques and tools; (2) summary matrices by development phase usage, a table of techniques and tools with associated keywords, and an…

  12. adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.

    2004-01-01

    A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.

  13. Experiments in Chemistry: A Model Science Software Tool.

    ERIC Educational Resources Information Center

    Malone, Diana; Tinker, Robert

    1984-01-01

    Describes "Experiments in Chemistry," in which experiments are performed using software and hardware interfaced to the Apple microcomputer's game paddle port. Experiments include temperature, pH electrode, and EMF (cell potential determinations, oxidation-reduction titrations, and precipitation titrations) investigations. (JN)

  14. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  15. A Study of Collaborative Software Development Using Groupware Tools

    ERIC Educational Resources Information Center

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  16. Orbit Analysis Tools Software (Version 1.0) User’s Manual

    DTIC Science & Technology

    1993-04-15

    A program to perform satellite mission and coverage analysis has been written: the Orbit Analysis Tools Software (OATS).

  17. Use of software tools for calculating flow accelerated corrosion of nuclear power plant equipment and pipelines

    NASA Astrophysics Data System (ADS)

    Naftal', M. M.; Baranenko, V. I.; Gulina, O. M.

    2014-06-01

    The results obtained from calculations of flow accelerated corrosion of equipment and pipelines operating at nuclear power plants constructed on the basis of PWR, VVER, and RBMK reactors carried out using the EKI-02 and EKI-03 software tools are presented. It is shown that the calculation error does not exceed its value indicated in the qualification certificates for these software tools. It is pointed out that calculations aimed at predicting the service life of pipelines and efficient surveillance of flow accelerated corrosion wear are hardly possible without using the above-mentioned software tools.

  18. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    SciTech Connect

    Habib, Salman; Roser, Robert

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  19. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Several research tasks were previously conducted, observations obtained, and possible suggestions contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  20. Methods, Software and Tools for Three Numerical Applications. Final report

    SciTech Connect

    E. R. Jessup

    2000-03-01

    This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).

  1. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    Excerpts from the tool-evaluation checklist: Are analyses conducted to determine their process-related causes? Is a mechanism used for error cause analysis? (Problem reporting & analysis) Is software productivity analyzed for major process steps? (Progress monitoring) Is there a mechanism for assuring that regression testing is routinely performed? (Process control) Is there a mechanism for ensuring the adequacy of regression testing coverage? (Change analysis, coverage analysis) Are formal test case…

  2. A software tool for 3D dose verification and analysis

    NASA Astrophysics Data System (ADS)

    Sa'd, M. Al; Graham, J.; Liney, G. P.

    2013-06-01

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments, because with the very high dose gradients used in modern treatment techniques, a small error in the spatial dose distribution can lead to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. In this paper we present a software solution that provides comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different treatment planning systems or different dose calculation algorithms.

  3. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality for use in the development of credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies, reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that a program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some of the current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.

  4. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    SciTech Connect

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  5. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    EPA Science Inventory

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh…

  6. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  7. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  8. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  9. Managing clinical research data: software tools for hypothesis exploration.

    PubMed

    Starmer, C F; Dietz, M A

    1990-07-01

    Data representation, data file specification, and the communication of data between software systems are playing increasingly important roles in clinical data management. This paper describes the concept of a self-documenting file that contains annotations or comments that aid visual inspection of the data file. We describe access of data from annotated files and illustrate data analysis with a few examples derived from the UNIX operating environment. Use of annotated files provides the investigator with both a useful representation of the primary data and a repository of comments that describe some of the context surrounding data capture.
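
    To make the annotated-file idea concrete, here is a minimal sketch (not the authors' code) that reads a hypothetical self-documenting format in which lines beginning with '#' carry the annotations and the remaining lines carry whitespace-separated data:

        import io

        # Hypothetical self-documenting data file: '#' lines preserve the
        # context surrounding data capture; other lines are the raw data.
        SAMPLE = """\
        # Study: ambulatory blood pressure, captured 1990-03-02
        # Columns: time_min systolic_mmHg
        0 121.0
        30 118.5
        """

        def read_annotated(stream):
            comments, rows = [], []
            for line in stream:
                line = line.strip()
                if not line:
                    continue
                if line.startswith("#"):
                    comments.append(line[1:].strip())
                else:
                    rows.append([float(v) for v in line.split()])
            return comments, rows

        comments, rows = read_annotated(io.StringIO(SAMPLE))
        print("\n".join(comments))      # review the embedded documentation
        print(len(rows), "observations")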

  10. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  11. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  12. lipID--a software tool for automated assignment of lipids in mass spectra.

    PubMed

    Hübner, Göran; Crone, Catharina; Lindner, Buko

    2009-12-01

    A new software tool called lipID is reported, which supports the identification of glycerophospholipids, glycosphingolipids, fatty acids and small oligosaccharides in mass spectra. The user-extendable software is a Microsoft (MS) Excel Add-In developed using Visual Basic for Applications and is compatible with all versions of MS Excel since MS Excel 97. It processes individually entered mass-to-charge values as well as mass lists, taking into account a number of user-defined options. The software's mode of operation, usage and options are explained, and the benefits and limitations of the tool are illustrated by means of three typical examples of lipid analyses.
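
    The core assignment step can be sketched as tolerance-based matching of observed mass-to-charge values against a table of candidate lipid masses; the table entries and the 0.3 u tolerance below are illustrative placeholders, not lipID's actual data or defaults:

        # Match observed m/z values against candidate lipid masses within a
        # tolerance. Candidate masses are invented for illustration.
        CANDIDATES = {
            "PE 34:1": 716.524,
            "PG 34:1": 747.518,
        }

        def assign(mz_list, tol=0.3):
            hits = []
            for mz in mz_list:
                for name, mass in CANDIDATES.items():
                    if abs(mz - mass) <= tol:
                        hits.append((mz, name, mz - mass))
            return hits

        for mz, name, err in assign([716.52, 747.60]):
            print(f"{mz:.3f} -> {name} (delta {err:+.3f})")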

  13. The Web Interface Template System (WITS), a software developer's tool

    SciTech Connect

    Lauer, L.J.; Lynam, M.; Muniz, T.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  14. An internet-based software tool for submitting crime information to forensic laboratories

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Rashpal S.; Govindarajulu, Sriram

    2004-11-01

    This paper describes an internet-based software tool developed for the West Virginia State Police Forensics Laboratory. The software enables law enforcement agents to submit crime information to the Forensic Laboratory via a secure Internet connection. Online electronic forms were created to mirror the existing paper-based forms, making the transition easier. The process of submitting case information was standardized and streamlined, thereby minimizing information inconsistency. The crime information, once gathered, is automatically stored in a database, and can be viewed and queried by any authorized law enforcement officer. The software tool will be deployed in all counties of WV.

  15. Multiscale Software Tool for Controls Prototyping in Supersonic Combustors

    DTIC Science & Technology

    2004-04-01

    …such systems for prototyping and design optimization becomes a formidable task. Present-day computational fluid dynamics (CFD) tools have found… …are the activation function and the synaptic weights. The activation function is typically a sigmoid or, for a greater dynamic range, a hyperbolic tangent… Neural networks have a built-in capability to adapt their synaptic weights to changes in the surrounding environment. A neural network trained in a specific environment can…
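
    The surviving fragment contrasts sigmoid and hyperbolic-tangent activations; a minimal illustration (not code from the report) of the two functions and their output ranges:

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))   # outputs in (0, 1)

        x = np.linspace(-4.0, 4.0, 9)
        print(np.round(sigmoid(x), 3))        # saturates near 0 and 1
        print(np.round(np.tanh(x), 3))        # outputs in (-1, 1): wider dynamic range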

  16. Software tools for developing an acoustics multimedia CD-ROM

    NASA Astrophysics Data System (ADS)

    Bigelow, Todd W.; Wheeler, Paul A.

    2003-10-01

    A multimedia CD-ROM was developed to accompany the textbook, Science of Sound, by Tom Rossing. This paper discusses the multimedia elements included in the CD-ROM and the various software packages used to create them. PowerPoint presentations with an audio-track background were converted to web pages using Impatica. Animations of acoustic examples and quizzes were developed using Flash by Macromedia. Vegas Video and Sound Forge by Sonic Foundry were used for editing video and audio clips, while Cleaner by Discreet was used to compress the clips for use over the internet. Math tutorials were presented as whiteboard presentations using Hitachi's Starboard to create the graphics and TechSmith's Camtasia Studio to record the presentations. The CD-ROM is in a web-page format created with Macromedia's Dreamweaver. All of these elements are integrated into a single course supplement that can be viewed by any computer with a web browser.

  17. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc flash protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built-in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.

  18. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  19. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  20. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  1. Development of a software tool for an internal dosimetry using MIRD method

    NASA Astrophysics Data System (ADS)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed. Many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis through dose calculation. For this reason, we developed CALRADDOSE, a software package that performs internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated residence times and absorbed doses for 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the residence times and absorbed doses obtained from the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software package for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
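
    A minimal sketch (not CALRADDOSE itself) of the two calculation steps named above: a residence time from a sampled time-activity curve, then the MIRD dose equation D = A0 * tau * S. All numerical values, including the S value, are invented placeholders:

        import numpy as np

        # Hypothetical sampled time-activity data: times in h, activity in MBq.
        t = np.array([1.0, 4.0, 24.0, 48.0, 72.0])
        A = np.array([95.0, 80.0, 40.0, 18.0, 8.0])
        A0 = 100.0                                  # administered activity, MBq

        # Residence time tau = (integral of A(t) dt) / A0, crude trapezoid rule.
        cumulated = float(np.sum(np.diff(t) * (A[:-1] + A[1:]) / 2.0))  # MBq*h
        tau = cumulated / A0                        # h

        # MIRD absorbed dose D = A0 * tau * S for one source->target pair;
        # S would come from a lookup table (placeholder value here).
        S = 4.1e-2                                  # mGy / (MBq*h), invented
        print(f"tau = {tau:.2f} h, dose = {A0 * tau * S:.1f} mGy")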

  2. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  3. RixsToolBox: software for the analysis of soft X-ray RIXS data acquired with 2D detectors.

    PubMed

    Kummer, K; Tamborrino, A; Amorese, A; Minola, M; Braicovich, L; Brookes, N B; Ghiringhelli, G

    2017-03-01

    A software package with a graphical user interface has been developed with the aim of facilitating data analysis for users of a new resonant inelastic X-ray scattering (RIXS) spectrometer installed at the ESRF beamline ID32. The software is organized in modules covering all relevant steps in the data reduction from a stack of several hundred two-dimensional CCD images to a single RIXS spectrum. It utilizes both full charge integration and single-photon centroiding to cope with high-flux and high-resolution requirements. Additional modules for further data analysis and the extraction of instrumental parameters are available. The software has been in routine use for about a year, and in that time many additional features have been incorporated. It now meets the users' need for an easy-to-use data analysis tool that allows them to inspect and understand data as it is acquired and thus steer their experiments more efficiently.

  4. A diagnostic tool for malaria based on computer software.

    PubMed

    Kotepui, Manas; Uthaisar, Kwuntida; Phunphuech, Bhukdee; Phiwklam, Nuoil

    2015-11-12

    Currently, the gold standard method for malaria diagnosis is the staining of thick and thin blood films examined by expert laboratorists. It requires well-trained personnel, is time consuming, and is not automated. For this study, Maladiag Software was developed to predict malaria infection in suspected malaria patients. The demographic data of patients, examination for malaria parasites, and complete blood count (CBC) profiles were analyzed. Binary logistic regression was used to create the equation for the malaria diagnosis. The diagnostic parameters of the equation were tested on 4,985 samples (703 infected and 4,282 control samples). The equation indicated 81.2% sensitivity and 80.3% specificity for predicting infection of malaria. The positive likelihood and negative likelihood ratios were 4.12 (95% CI = 4.01-4.23) and 0.23 (95% CI = 0.22-0.25), respectively. The model also yielded an odds ratio of 17.6 (P value < 0.0001, 95% CI = 16.0-19.3). The equation can predict malaria infection after adjusting for age, gender, nationality, monocyte (%), platelet count, neutrophil (%), lymphocyte (%), and the RBC count of patients. The diagnostic accuracy was 0.877 (area under the curve, AUC; 95% CI = 0.871-0.883). The system, when used in combination with other clinical and microscopy methods, might improve malaria diagnosis and enhance prompt treatment.
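
    A hedged sketch of the modeling approach described, binary logistic regression with sensitivity and specificity read off the confusion matrix, using synthetic stand-in data rather than the study's CBC profiles:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(0)
        n = 1000
        X = rng.normal(size=(n, 3))   # stand-ins for e.g. platelets, %neutrophils, RBC
        y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=n) > 0.5).astype(int)

        model = LogisticRegression().fit(X, y)
        tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
        print(f"sensitivity = {tp / (tp + fn):.3f}")
        print(f"specificity = {tn / (tn + fp):.3f}")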

  5. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  6. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    NASA Technical Reports Server (NTRS)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user-supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature, which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user-friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  7. Automated software development tools in the MIS (Management Information Systems) environment

    SciTech Connect

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  8. Development of Oceanographic Software Tools and Applications for Navy Operational Use

    DTIC Science & Technology

    1997-09-30

    DEVELOPMENT OF OCEANOGRAPHIC SOFTWARE TOOLS AND APPLICATIONS FOR NAVY OPERATIONAL USE. James H. Corbin, Center for Air Sea Technology, Mississippi State… …applications, were significantly reduced. Accordingly, the CAST objective for FY97 was to develop interactive graphical tools for shipboard METOC briefers… This was in response to a COMSIXTHFLT validated METOC requirement to provide visualization briefing tools, animations, and 3-D graphical depictions

  9. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  10. Software tools and frameworks in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Brun, R.

    2011-01-01

    In many fields of science and industry the computing environment has grown at an exponential speed in the past 30 years. From ad hoc solutions for each problem, the field has evolved gradually to use or reuse systems developed across the years for the same environment or coming from other fields with the same requirements. Several frameworks have emerged to solve common problems. In High Energy Physics (HEP) and Nuclear Physics, we have witnessed the emergence of common tools, packages and libraries that have gradually become the cornerstone of computing in these fields. The emergence of these systems has been complex because the computing field is evolving rapidly, the problems to be solved are more and more complex, and experiments now involve several thousand physicists from all over the world. This paper describes the emergence of these frameworks and their evolution from libraries of independent subroutines to task-oriented packages and to general experiment frameworks.

  11. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two focuses of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  12. Methods and software tools for design evaluation in population pharmacokinetics-pharmacodynamics studies.

    PubMed

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)-pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization.
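
    The quantity these tools evaluate can be illustrated on a deliberately simplified fixed-effects analogue (no random effects, additive error), for which the Fisher information matrix reduces to J^T J / sigma^2 with J the sensitivity matrix, and predicted SEs are the square roots of the diagonal of its inverse. All values are illustrative:

        import numpy as np

        dose, sigma = 100.0, 0.5
        times = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0])
        theta = np.array([2.0, 10.0])               # CL, V (invented values)

        def model(th, t):
            CL, V = th                               # one-compartment IV bolus
            return (dose / V) * np.exp(-(CL / V) * t)

        # Sensitivities J[i, j] = dC(t_i)/dtheta_j by central differences.
        J = np.empty((times.size, theta.size))
        for j in range(theta.size):
            h = 1e-5 * theta[j]
            up, dn = theta.copy(), theta.copy()
            up[j] += h
            dn[j] -= h
            J[:, j] = (model(up, times) - model(dn, times)) / (2.0 * h)

        FIM = J.T @ J / sigma**2                     # FIM for additive error
        SE = np.sqrt(np.diag(np.linalg.inv(FIM)))
        print("predicted SE(CL, V):", np.round(SE, 4))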

  13. Dynamic Susceptibility Contrast-MRI Quantification Software Tool: Development and Evaluation

    PubMed Central

    Korfiatis, Panagiotis; Kline, Timothy L.; Kelm, Zachary S.; Carter, Rickey E.; Hu, Leland S.; Erickson, Bradley J.

    2016-01-01

    Relative cerebral blood volume (rCBV) is a magnetic resonance imaging biomarker that is used to differentiate progression from pseudoprogression in patients with glioblastoma multiforme, the most common primary brain tumor. However, calculated rCBV depends considerably on the software used. Automating all steps required for rCBV calculation is important, as user interaction can lead to increased variability and possible inaccuracies in clinical decision-making. Here, we present an automated tool for computing rCBV from dynamic susceptibility contrast-magnetic resonance imaging that includes leakage correction. The entrance and exit bolus time points are automatically calculated using wavelet-based detection. The proposed tool is compared with 3 Food and Drug Administration-approved software packages, 1 automatic and 2 requiring user interaction, on a data set of 43 patients. We also evaluate manual and automated white matter (WM) selection for normalization of the cerebral blood volume maps. Our system showed good agreement with 2 of the 3 software packages. The intraclass correlation coefficient for all comparisons between the same software operated by different people was >0.880, except for FuncTool when operated by user 1 versus user 2. Little variability in agreement between software tools was observed when using different WM selection techniques. Our algorithm for automatic rCBV calculation with leakage correction and automated WM selection agrees well with 2 out of the 3 FDA-approved software packages. PMID:28066810
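
    A minimal sketch of the uncorrected core computation, using the standard DSC relation Delta-R2*(t) = -ln(S(t)/S0)/TE and taking CBV as proportional to the area under that curve, normalized by a white-matter reference. It omits the leakage correction and automated bolus detection the tool implements; all values are synthetic:

        import numpy as np

        TE = 0.030                                   # echo time in s (illustrative)

        def rcbv(signal, t, s0, wm_value):
            dr2 = -np.log(signal / s0) / TE          # Delta-R2*(t)
            cbv = float(np.sum(np.diff(t) * (dr2[:-1] + dr2[1:]) / 2.0))
            return cbv / wm_value                    # normalize by WM reference

        t = np.linspace(0.0, 60.0, 61)               # s
        s0 = 1000.0
        signal = s0 * np.exp(-0.4 * np.exp(-((t - 30.0) ** 2) / 40.0))  # bolus dip
        print(f"rCBV = {rcbv(signal, t, s0, wm_value=5.0):.2f}")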

  14. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    PubMed

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development.

  15. Tutorial for the software tools mail system - MSG

    SciTech Connect

    Sventek, V.A.

    1984-10-01

    This manual was written with two purposes in mind: (1) to help the reader get started using the system quickly; and (2) to explain the more advanced features of the system needed when the mail load gets heavier and the time available for dealing with it gets shorter. The first section shows how to use the simplest form of each command, giving just enough information to use the mail system comfortably and still be productive. The second section defines additional terms and describes the more technical aspects of the system, and should be used as a reference guide.

  16. Establishing a Methodology for Evaluation and Selecting Computer Aided Software Engineering Tools for a Defined Software Engineering Environment at the Air Force Institute of Technology School of Engineering

    DTIC Science & Technology

    1991-12-01

    …F. Lecouat and V. Ambriola, "A Tool to Coordinate Tools," IEEE Software: 17-25 (November 1988). 6. Bruce, T. A., J. Fuller, and T. Moriarty, "So You…" Journal of Systems Management, 40-5: 29-32 (May 1989). 14. Dart, S. A., R. J. Ellison, P. H. Feiler, and A. N. Habermann, "Software…

  17. PROBEmer: a web-based software tool for selecting optimal DNA oligos

    PubMed Central

    Emrich, Scott J.; Lowe, Mary; Delcher, Arthur L.

    2003-01-01

    PROBEmer (http://probemer.cs.loyola.edu) is a web-based software tool that enables a researcher to select optimal oligos for PCR applications and multiplex detection platforms including oligonucleotide microarrays and bead-based arrays. Given two groups of nucleic-acid sequences, a target group and a non-target group, the software identifies oligo sequences that occur in members of the target group, but not in the non-target group. To help predict potential cross hybridization, PROBEmer computes all near neighbors in the non-target group and displays their alignments. The software has been used to obtain genus-specific prokaryotic probes based on the 16S rRNA gene, gene-specific probes for expression analyses and PCR primers. In this paper, we describe how to use PROBEmer, the computational methods it employs, and experimental results for oligos identified by this software tool. PMID:12824409
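
    The selection idea, oligos that occur in the target group but not in the non-target group, can be sketched as a k-mer set difference. Real probe design, including PROBEmer's near-neighbor screening, adds mismatch, melting-temperature and composition filters that this toy version ignores:

        def kmers(seq, k):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def candidate_oligos(targets, non_targets, k=20):
            in_target = set.union(*(kmers(s, k) for s in targets))
            in_non_target = set.union(*(kmers(s, k) for s in non_targets))
            return in_target - in_non_target         # target-specific k-mers only

        targets = ["ACGTACGTTAGCTAGCTAGGACGTTAGCATCG"]      # toy sequences
        non_targets = ["ACGTACGTTAGCTAGCTAGGACGTTAGCATCC"]
        print(sorted(candidate_oligos(targets, non_targets, k=12))[:3])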

  18. [Software CMAP TOOLS™ to build concept maps: an evaluation by nursing students].

    PubMed

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to write. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To do this, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focal group technique. The results showed that the software facilitates and guarantees the organization, visualization, and correlation of the data, but there are difficulties related to the handling of its tools initially. In conclusion, the formatting and auto formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of the software utilization.

  19. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  20. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
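
    A hedged sketch of the kind of architecture-based computation described, in the spirit of Cheung-style Markov models: given per-component reliabilities and control-flow transition probabilities, system reliability is the probability of reaching the terminal success state. The component values are invented:

        import numpy as np

        # Components 0..2: r[i] is component i's reliability, W[i, j] the
        # probability control passes from i to j given i succeeds; component 2
        # hands off to the terminal success state.
        r = np.array([0.99, 0.97, 0.98])
        W = np.array([[0.0, 0.7, 0.3],
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])

        Q = r[:, None] * W                           # transient transitions
        N = np.linalg.inv(np.eye(3) - Q)             # fundamental matrix
        reliability = N[0, 2] * r[2]                 # reach component 2, then succeed
        print(f"system reliability = {reliability:.4f}")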

  1. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  2. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.

    PubMed

    Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A

    2016-09-01

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples.
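
    A minimal sketch (not the published tool) of the 2D Gaussian fitting step used to locate one atom column, applied to a synthetic peak:

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(coords, amp, x0, y0, sigma, offset):
            x, y = coords
            g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2))
            return (g + offset).ravel()

        # Synthetic 15x15 patch around one column (true center at 7.3, 6.8).
        y, x = np.mgrid[0:15, 0:15].astype(float)
        img = gauss2d((x, y), 100.0, 7.3, 6.8, 2.0, 10.0).reshape(15, 15)
        img += np.random.default_rng(1).normal(0.0, 1.0, img.shape)  # noise

        p0 = (img.max() - img.min(), 7.0, 7.0, 2.0, img.min())
        popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
        print(f"fitted column position: ({popt[1]:.3f}, {popt[2]:.3f})")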

  3. Proceedings of the Workshop on software tools for distributed intelligent control systems

    SciTech Connect

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  4. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill fewer usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface.
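
    The figure-ground step mentioned above can be sketched with a global Otsu threshold on a synthetic fluorescence-like image (assuming scikit-image is available):

        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(0)
        img = rng.normal(100.0, 10.0, (128, 128))    # dark background
        img[30:60, 30:60] += 120.0                   # two bright "cells"
        img[80:110, 70:100] += 140.0

        t = threshold_otsu(img)                      # threshold from the histogram
        mask = img > t                               # figure-ground separation
        print(f"threshold = {t:.1f}, foreground fraction = {mask.mean():.3f}")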

  5. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    SciTech Connect

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A.

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
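
    As a hedged illustration of the kind of cost-minimizing mix optimization described (not the actual CCOT-P model, which is far richer), a toy linear program choosing personnel counts subject to a demand constraint and per-type caps:

        from scipy.optimize import linprog

        # Minimize total cost over x = (military, civilian, contractor) counts;
        # all coefficients are invented for illustration.
        cost = [120.0, 100.0, 90.0]                  # cost per person
        A_ub = [[-1.0, -1.0, -1.0]]                  # total personnel >= demand,
        b_ub = [-500.0]                              # written as -(total) <= -500
        bounds = [(0, 300), (0, 300), (0, 300)]      # per-type availability caps

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print(res.x, res.fun)                        # optimal mix, minimum cost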

  6. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-06-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.

  7. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  8. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    PubMed Central

    2011-01-01

    Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, Graph
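
    Two of the "easily computable network properties" named above can be sketched with networkx, comparing a data network against a random model network (a toy stand-in for a PPI network):

        import networkx as nx

        data_net = nx.karate_club_graph()            # stand-in for a PPI network
        model_net = nx.gnm_random_graph(data_net.number_of_nodes(),
                                        data_net.number_of_edges(), seed=1)

        for name, g in [("data", data_net), ("model", model_net)]:
            print(name,
                  "degree histogram:", nx.degree_histogram(g)[:6],
                  "avg clustering: %.3f" % nx.average_clustering(g))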

  9. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  10. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    SciTech Connect

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern recognition of contours was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used so that the system can self-update geometry variations of normal structures, using physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
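
    The supervised PCA idea can be illustrated generically: fit principal components to physician-approved contour shape vectors, then score a new contour by its reconstruction error. The sketch below is a minimal Python analogue with hypothetical shape vectors, not the authors' C#/Accord.Net implementation.

        # Generic PCA anomaly scoring for contour shape vectors (illustrative only).
        import numpy as np

        def fit_pca(train, n_components):
            mean = train.mean(axis=0)
            _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
            return mean, vt[:n_components]  # principal axes of approved contours

        def reconstruction_error(x, mean, components):
            coeffs = (x - mean) @ components.T
            recon = mean + coeffs @ components
            return float(np.linalg.norm(x - recon))

        train = np.random.default_rng(1).normal(size=(50, 120))  # 50 approved contours
        mean, comps = fit_pca(train, n_components=5)
        score = reconstruction_error(train[0] + 3.0, mean, comps)  # perturbed contour
        print(f"abnormality score: {score:.2f}")  # flag if above a training-set threshold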

  11. DairyGEM: A software tool for assessing emissions and mitigation strategies for dairy production systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many gaseous compounds are emitted from dairy farms. Those of current interest include the toxic compounds of ammonia and hydrogen sulfide and the greenhouse gases of methane, nitrous oxide and carbon dioxide. A relatively easy to use software tool was developed that predicts these emissions through...

  12. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  13. DairyGEM: a software tool for whole farm assessment of emission mitigation strategies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate assessment of the impact of management on agricultural emissions requires consideration of many farm components and their interactions. A comprehensive assessment is needed because changes made to reduce one emission type or source may increase another. A new software tool was developed tha...

  14. Understanding Collaborative Learning: Small Group Work on Contextual Problems Using a Multi-Representational Software Tool.

    ERIC Educational Resources Information Center

    Smith, Erick; Confrey, Jere

    The interactions of three high school juniors (two females and one male) working together on a series of contextual mathematics problems using a multirepresentational software tool were studied. Focus was on determining how a constructivist model of learning, based on an individual problematic-action-reflection model, can be extended to offer…

  15. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  16. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow.

  17. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  18. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    ERIC Educational Resources Information Center

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  19. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    PubMed

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge.
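
    The workflow SDMdata automates can be approximated against GBIF's public occurrence REST API (independently documented at api.gbif.org); the sketch below fetches records for an example species and applies a basic coordinate sanity check. It is not SDMdata's own code.

        # Fetch GBIF occurrences and flag bad coordinates (illustrative, not SDMdata code).
        import requests

        def fetch_occurrences(species, limit=50):
            r = requests.get("https://api.gbif.org/v1/occurrence/search",
                             params={"scientificName": species, "limit": limit},
                             timeout=30)
            r.raise_for_status()
            return r.json()["results"]

        def coords_ok(rec):
            lat, lon = rec.get("decimalLatitude"), rec.get("decimalLongitude")
            if lat is None or lon is None:
                return False
            return -90 <= lat <= 90 and -180 <= lon <= 180 and (lat, lon) != (0, 0)

        records = fetch_occurrences("Panthera leo")
        good = [r for r in records if coords_ok(r)]
        print(f"{len(good)}/{len(records)} records passed the coordinate check")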

  20. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  1. sigTOOL: A MATLAB-based environment for sharing laboratory-developed software to analyze biological signals.

    PubMed

    Lidierth, Malcolm

    2009-03-30

    This paper describes a software package, named sigTOOL, for processing biological signals. The package runs in the MATLAB programming environment and has been designed to promote the sharing of laboratory-developed software across the worldwide web. As proof-of-concept of the design of the system, sigTOOL has been used to build an analysis application for dealing with neuroscience data complete with a user-friendly graphical user interface which implements a range of waveform and spike-train analysis functions. The interface allows many commonly used neuroscience data file formats to be loaded (including those of Alpha Omega, Cambridge Electronic Design, Cyberkinetics Inc., Molecular Devices, Nex Technologies and Plexon Instruments). Waveform analysis functions selectable from the interface support waveform averaging (mean and median), auto- and cross-correlation, power spectral analysis, coherence estimation, digital filtering (feedback and feedforward) and resampling. Spike-train analyses include interspike interval distributions, Poincaré plots, event auto- and cross-correlations, spike-triggered averaging, stimulus driven and phase-related peri-event time histograms and rasters as well as frequencygrams. User-developed additions to sigTOOL that are archived and distributed electronically will be added to the sigTOOL interface on-the-fly, without the need to modify the core sigTOOL code. Full sigTOOL functionality will be provided to support the user-developed code, including the ability to record a user action history for batch processing of files and support for exporting the results of analyses to external graphics editing software and spreadsheet-based data processing packages.
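
    As a flavour of the spike-train analyses listed above, the following numpy sketch computes an interspike-interval (ISI) distribution and mean firing rate for a hypothetical spike train; sigTOOL itself is MATLAB code and is not reproduced here.

        # ISI distribution for a hypothetical spike train (illustrative numpy version).
        import numpy as np

        rng = np.random.default_rng(2)
        spike_times = np.sort(rng.uniform(0.0, 10.0, size=300))  # seconds

        isi = np.diff(spike_times)  # intervals between consecutive spikes
        counts, edges = np.histogram(isi, bins=np.arange(0.0, 0.2, 0.005))
        mean_rate = len(spike_times) / (spike_times[-1] - spike_times[0])
        print(f"mean rate: {mean_rate:.1f} Hz; median ISI: {np.median(isi) * 1e3:.1f} ms")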

  2. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    SciTech Connect

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on configuring and building software is a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  3. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  4. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    PubMed

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  5. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  6. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    PubMed

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data-retrieval tool that uses a text-mining algorithm to extract specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies the various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.

  7. A proposal for reverse engineering CASE tools to support new software development

    SciTech Connect

    Maxted, A.

    1993-06-01

    Current CASE technology provides sophisticated diagramming tools to generate a software design. The design, stored internal to the CASE tool, is bridged to the code via code generators. There are several limitations to this technique: (1) the portability of the design is limited to the portability of the CASE tools, and (2) the code generators offer a clumsy link between design and code. The CASE tool, though valuable during design, becomes a hindrance during implementation. Frustration frequently causes the CASE tool to be abandoned during implementation, permanently severing the link between design and code. Current CASE stores the design in a CASE-internal structure, from which code is generated. The technique presented herein suggests that CASE tools store the system knowledge directly in code. The CASE support then switches from an emphasis on code generators to employing state-of-the-art reverse engineering techniques for document generation. Graphical and textual descriptions of each software component (e.g., Ada Package) may be generated via reverse engineering techniques from the code. These reverse-engineered descriptions can be merged with system overview diagrams to form a top-level design document. The resulting document can readily reflect changes to the software components by automatically generating new component descriptions for the changed components. The proposed auto-documentation technique facilitates the document upgrade task at later stages of development (e.g., design, implementation and delivery) by using the component code as the source of the component descriptions. The CASE technique presented herein is a unique application of reverse engineering techniques to new software systems. This technique contrasts with more traditional CASE auto code generation techniques.

  8. ConsensusCluster: a software tool for unsupervised cluster discovery in numerical data.

    PubMed

    Seiler, Michael; Huang, C Chris; Szalma, Sandor; Bhanot, Gyan

    2010-02-01

    We have created a stand-alone software tool, ConsensusCluster, for the analysis of high-dimensional single nucleotide polymorphism (SNP) and gene expression microarray data. Our software implements the consensus clustering algorithm and principal component analysis to stratify the data into a given number of robust clusters. The robustness is achieved by combining clustering results from data and sample resampling as well as by averaging over various algorithms and parameter settings to achieve accurate, stable clustering results. We have implemented several different clustering algorithms in the software, including K-Means, Partition Around Medoids, Self-Organizing Map, and Hierarchical clustering methods. After clustering the data, ConsensusCluster generates a consensus matrix heatmap to give a useful visual representation of cluster membership, and automatically generates a log of selected features that distinguish each pair of clusters. ConsensusCluster gives more robust and more reliable clusters than common software packages and, therefore, is a powerful unsupervised learning tool that finds hidden patterns in data that might shed light on its biological interpretation. This software is free and available from http://code.google.com/p/consensus-cluster .
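
    The consensus idea, counting how often each pair of samples co-clusters across resampled runs, can be sketched briefly. The version below uses only repeated K-Means on subsamples (the published tool additionally averages over algorithms and parameter settings) with synthetic data.

        # Minimal consensus-matrix sketch (simplified relative to ConsensusCluster).
        import numpy as np
        from sklearn.cluster import KMeans

        def consensus_matrix(x, k=3, n_runs=50, frac=0.8, seed=0):
            rng = np.random.default_rng(seed)
            n = len(x)
            together, sampled = np.zeros((n, n)), np.zeros((n, n))
            for _ in range(n_runs):
                idx = rng.choice(n, size=int(frac * n), replace=False)
                labels = KMeans(n_clusters=k, n_init=10).fit_predict(x[idx])
                same = labels[:, None] == labels[None, :]
                together[np.ix_(idx, idx)] += same
                sampled[np.ix_(idx, idx)] += 1
            return np.divide(together, sampled,
                             out=np.zeros_like(together), where=sampled > 0)

        rng = np.random.default_rng(1)
        x = np.vstack([rng.normal(c, 0.5, size=(30, 4)) for c in (0, 3, 6)])
        m = consensus_matrix(x)  # entries near 1 mark samples that robustly co-cluster
        print(m.shape)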

  9. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    PubMed

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  10. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766
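
    As a concrete example of the tensor-derived quantities such reviews cover, fractional anisotropy (FA) is computed from the three diffusion-tensor eigenvalues as FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||; a minimal implementation:

        # Fractional anisotropy from diffusion tensor eigenvalues (standard formula).
        import numpy as np

        def fractional_anisotropy(eigenvalues):
            lam = np.asarray(eigenvalues, dtype=float)
            md = lam.mean()  # mean diffusivity
            return np.sqrt(1.5) * np.linalg.norm(lam - md) / np.linalg.norm(lam)

        print(fractional_anisotropy([1.7e-3, 0.3e-3, 0.3e-3]))  # elongated tensor -> high FA
        print(fractional_anisotropy([1.0e-3, 1.0e-3, 1.0e-3]))  # isotropic tensor -> FA = 0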

  11. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
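
    One elementary measure such a tool can report is total harmonic distortion (THD), the ratio of harmonic to fundamental amplitude; the sketch below is illustrative and far simpler than the toolkit's actual analyses.

        # THD of a test signal with a deliberate 5% second harmonic (illustrative).
        import numpy as np

        fs, f0 = 48000, 1000.0
        t = np.arange(fs) / fs
        x = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 2 * f0 * t)

        spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        freqs = np.fft.rfftfreq(len(x), 1 / fs)

        def peak(f):  # amplitude of the bin nearest frequency f
            return spec[np.argmin(np.abs(freqs - f))]

        harmonics = [peak(n * f0) for n in range(2, 6)]
        thd = np.sqrt(sum(h ** 2 for h in harmonics)) / peak(f0)
        print(f"THD: {100 * thd:.2f}%")  # ~5% for this test signal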

  12. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    NASA Astrophysics Data System (ADS)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M. Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves or picking is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In the last decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open source, software graphical tool, named APASVO, which allows picking tasks in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. Besides, this graphical tool is complemented with two additional command line tools, an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
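
    The STA-LTA detector mentioned above takes the ratio of a short-term to a long-term average of signal energy and declares a trigger where the ratio crosses a threshold. A plain numpy sketch follows; ObsPy, on which APASVO relies, ships an optimized classic_sta_lta.

        # Classic STA/LTA on a synthetic trace (illustrative, not APASVO code).
        import numpy as np

        def sta_lta(trace, nsta, nlta):
            energy = trace.astype(float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            i = np.arange(nlta, len(trace) + 1)          # window end indices
            sta = (csum[i] - csum[i - nsta]) / nsta
            lta = (csum[i] - csum[i - nlta]) / nlta
            return sta / np.maximum(lta, 1e-12)

        rng = np.random.default_rng(3)
        trace = rng.normal(0, 1, 6000)
        trace[3000:3400] += rng.normal(0, 8, 400)        # synthetic event
        ratio = sta_lta(trace, nsta=50, nlta=1000)
        onset = 1000 + int(np.argmax(ratio > 4.0))       # ratio[0] is at sample nlta
        print(f"trigger near sample {onset}")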

  13. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  14. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    PubMed

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows currently experiences increasing popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240).

  15. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
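
    A generic version of such automated outlier flagging, not the Data Quality Office's production checks, marks points that deviate from a rolling median by several robust standard deviations:

        # Rolling-median outlier flag for a data stream (generic illustration).
        import numpy as np

        def flag_outliers(x, window=51, nsig=5.0):
            pad = window // 2
            padded = np.pad(x, pad, mode="edge")
            med = np.array([np.median(padded[i:i + window]) for i in range(len(x))])
            resid = x - med
            sigma = 1.4826 * np.median(np.abs(resid))  # robust (MAD-based) scale
            return np.abs(resid) > nsig * sigma

        x = np.sin(np.linspace(0, 20, 2000)) + np.random.default_rng(4).normal(0, 0.05, 2000)
        x[500] += 3.0                                  # inject a spike
        print(np.flatnonzero(flag_outliers(x)))        # the spike at index 500 is flagged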

  16. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    SciTech Connect

    Alfonsi, A.; Rabiti, C.; Mandelli, D.; Cogliati, J. J.; Kinoshita, R. A.

    2013-07-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)

  17. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    SciTech Connect

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.

  18. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    SciTech Connect

    Turner, Dennis L.

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility, a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time, improve reproducibility by way of an accurate accelerator model, and provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  19. A software tool to aid budget planning for long-term care at local authority level.

    PubMed

    Xie, Haifeng; Chaussalet, Thierry; Toffa, Sam; Crowther, Peter

    2005-01-01

    In this paper, we present a software tool that implements a novel modelling framework developed by the authors to provide useful information to budget planners for long-term care at local authority level. By combining unit costs of care with an underlying survival model for publicly funded residents in long-term care, the software tool is able to forecast the cost of maintaining the group of elderly people who are currently in long-term care (referred to as known commitments) for a period of time. Users interact with the tool via a friendly graphical interface that guides them through a set of screens of options in a familiar wizard fashion. This tool was created and tested in collaboration with an English borough. Feedback from the care planner and manager shows that the tool helps them gain a better understanding of the behaviour of residents' length of stay under their care, and provides quantitative inputs into their decision making on budget planning for long-term care.
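
    The core calculation, combining unit costs with a survival model to cost the known commitments, can be sketched with a toy exponential survival curve. All figures below are hypothetical; the authors' actual model is not reproduced.

        # Toy "known commitments" forecast: unit cost times expected residents
        # still in care each month, under an assumed exponential survival curve.
        import numpy as np

        residents = 120                 # currently in publicly funded long-term care
        monthly_unit_cost = 2500.0      # GBP per resident-month (hypothetical)
        median_stay_months = 18.0
        hazard = np.log(2) / median_stay_months

        months = np.arange(1, 37)                  # 3-year horizon
        survival = np.exp(-hazard * months)        # P(still in care at month t)
        expected_cost = residents * monthly_unit_cost * survival

        print(f"year-1 commitment: £{expected_cost[:12].sum():,.0f}")
        print(f"3-year commitment: £{expected_cost.sum():,.0f}")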

  20. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    SciTech Connect

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
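
    Alongside the plots SEE IT generates, two standard statistics for comparing measured with simulated building energy data are NMBE and CV(RMSE); a brief illustrative computation (these metrics are standard practice, not a described feature of SEE IT):

        # NMBE and CV(RMSE) between measured and simulated series (illustrative).
        import numpy as np

        def nmbe(measured, simulated):
            return 100.0 * (simulated - measured).sum() / measured.sum()

        def cv_rmse(measured, simulated):
            rmse = np.sqrt(np.mean((simulated - measured) ** 2))
            return 100.0 * rmse / measured.mean()

        rng = np.random.default_rng(5)
        measured = 50 + 10 * np.sin(np.linspace(0, 12 * np.pi, 1460))  # hypothetical kWh
        simulated = measured * 1.03 + rng.normal(0, 2, measured.size)  # model with 3% bias

        print(f"NMBE: {nmbe(measured, simulated):+.1f}%  "
              f"CV(RMSE): {cv_rmse(measured, simulated):.1f}%")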

  1. A Structural Health Monitoring Software Tool for Optimization, Diagnostics and Prognostics

    DTIC Science & Technology

    2011-01-01

    A Structural Health Monitoring Software Tool for Optimization, Diagnostics and Prognostics. Seth S. Kessler, Eric B. Flynn, Christopher T. ... technology more accessible, and commercially practical. 1. INTRODUCTION Currently successful laboratory non-destructive testing and monitoring...

  2. A Multiscale Software Tool for Field/Circuit Co-Simulation

    DTIC Science & Technology

    2011-12-15

    Lumped Port 2 on the right end of the microstrip line. The simulated S-parameters, S11 and S21, of the active microwave amplifier circuit are shown in... This report is developed under... topic #A08-T004, contract W911NF-09-C-0159. As the final report, we have developed a new multiscale field/circuit solver by combining three efficient

  3. Tools to aid the specification and design of flight software, appendix B

    NASA Technical Reports Server (NTRS)

    Bristow, G.

    1980-01-01

    The tasks that are normally performed during the specification and architecture design stages of software development are identified. Ways that tools could perform, or aid the performance of, such tasks are also identified. Much of the verification and analysis that is suggested is currently rarely performed during these early stages, but it is believed that this analysis should be performed as early as possible so that errors are detected sooner.

  4. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  5. A software tool for determination of breast cancer treatment methods using data mining approach.

    PubMed

    Cakır, Abdülkadir; Demirel, Burçin

    2011-12-01

    In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help the oncology doctor suggest treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied one by one to this dataset and the results are compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to predict, through a Java NetBeans interface. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes. For the hormonotherapy output IB1, for the tamoxifen and radiotherapy outputs Multilayer Perceptron, and for the chemotherapy output the Decision Table algorithm show the best accuracy performance. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
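
    An analogous model-comparison loop can be written with scikit-learn in place of Weka: IB1 is essentially 1-nearest-neighbour, and a decision tree stands in for Weka's Decision Table. The data below are synthetic, not the Ankara Oncology records.

        # Per-output classifier comparison, scikit-learn analogue of the Weka workflow.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=462, n_features=12, random_state=0)
        models = {
            "IB1-like (1-NN)": KNeighborsClassifier(n_neighbors=1),
            "Multilayer Perceptron": MLPClassifier(max_iter=1000, random_state=0),
            "Decision tree (stand-in)": DecisionTreeClassifier(random_state=0),
        }
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=10).mean()
            print(f"{name}: {acc:.3f}")  # keep the best model per treatment output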

  6. The impact of layer thickness on the performance of additively manufactured lapping tools

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2015-10-01

    Lower cost additive manufacturing (AM) machines which have emerged in recent years are capable of producing tools, jigs, and fixtures that are useful in optical fabrication. In particular, AM tooling has been shown to be useful in lapping glass workpieces. Various AM machines are distinguished by the processes, materials, build times, and build resolution they provide. This research investigates the impact of varied build resolution (specifically layer resolution) on the lapping performance of tools built using the stereolithography (SLA) process in 50 μm and 100 μm layer thicknesses with a methacrylate photopolymer resin on a high resolution desktop printer. As with previous work, the lapping tools were shown to remove workpiece material during the lapping process, but the tools themselves also experienced significant wear, on the order of 2-3 times the mass loss of the glass workpieces. The tool wear rates for the 100 μm and 50 μm layer tools were comparable, but the 50 μm layer tool was 74% more effective at removing material from the glass workpiece, which is attributed to some abrasive particles being trapped in the coarser surface of the 100 μm layer tooling and not being available to interact with the glass workpiece. Considering the tool wear, these additively manufactured tools are most appropriate for prototype tooling, where the low cost (<$45) and quick turnaround make them attractive when compared to a machined tool.

  7. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    PubMed

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
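
    The a posteriori Bayesian adjustment described above can be illustrated with a toy one-compartment model: find the maximum a posteriori clearance given one measured concentration and a lognormal population prior, then rescale the dose. All parameters are hypothetical and no reviewed program's code is used.

        # Toy MAP-Bayesian TDM sketch (assumed one-compartment bolus at steady state).
        import numpy as np
        from scipy.optimize import minimize_scalar

        dose, tau, v = 500.0, 12.0, 40.0          # mg, h, L (hypothetical)
        cl_pop, omega, sigma = 4.0, 0.3, 0.5      # prior CL (L/h), BSV, residual SD
        c_obs, t_obs = 6.2, 11.5                  # measured conc (mg/L) at t after dose

        def predict(cl, t):
            ke = cl / v
            return dose / v * np.exp(-ke * t) / (1 - np.exp(-ke * tau))

        def neg_log_posterior(log_cl):
            pred = predict(np.exp(log_cl), t_obs)
            return ((c_obs - pred) / sigma) ** 2 + ((log_cl - np.log(cl_pop)) / omega) ** 2

        res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(20)),
                              method="bounded")
        cl_map = np.exp(res.x)
        new_dose = dose * 8.0 / predict(cl_map, tau)   # rescale to an 8 mg/L target trough
        print(f"MAP clearance: {cl_map:.2f} L/h; suggested dose: {new_dose:.0f} mg")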

  8. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    PubMed

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-08

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional usability that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.

  9. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
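
    The simulation pattern described, autonomous node/link/institution code each time step plus engines acting over the network, looks schematically like the sketch below. This is a generic illustration, not Pynsim's actual API.

        # Schematic multi-agent time step (generic, NOT Pynsim's real API).
        class Node:
            def __init__(self, name, demand):
                self.name, self.demand, self.supplied = name, demand, 0.0
            def step(self, t):
                self.supplied = 0.0  # reset before this step's allocation

        class Institution:            # groups nodes for a collective decision
            def __init__(self, nodes):
                self.nodes = nodes
            def step(self, t):
                total = sum(n.demand for n in self.nodes)
                for n in self.nodes:
                    n.priority = n.demand / total

        class AllocationEngine:       # an "engine" acting over the network
            def run(self, nodes, available):
                for n in nodes:
                    n.supplied = min(n.demand, n.priority * available)

        nodes = [Node("city", 80.0), Node("farm", 120.0)]
        utility = Institution(nodes)
        engine = AllocationEngine()
        for t in range(3):            # time-stepped simulation
            for agent in nodes + [utility]:
                agent.step(t)
            engine.run(nodes, available=150.0)
            print(t, {n.name: round(n.supplied, 1) for n in nodes})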

  10. Teaching Locus with a Conserved Property by Integrating Mathematical Tools and Dynamic Geometric Software

    ERIC Educational Resources Information Center

    Stupel, Moshe; Segal, Ruti; Oxman, Victor

    2016-01-01

    In this article, we present investigative tasks that concern loci, which integrate the use of dynamic geometry software (DGS) with mathematics for proving the obtained figures. Additional conditions were added to the loci: ellipse, parabola and circle, which result in the emergence of new loci, similar in form to the original loci. The…

  11. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    PubMed

    Ma, Tianle; Zhang, Aidong

    2016-02-26

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of omics informatics landscape: the explosion of scattered individual omics informatics tools with each of which focuses on a specific task in both single- and multi- omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.

  12. Software tools of the Computis European project to process mass spectrometry images.

    PubMed

    Robbe, Marie-France; Both, Jean-Pierre; Prideaux, Brendan; Klinkert, Ivo; Picaud, Vincent; Schramm, Thorsten; Hester, Alfons; Guevara, Victor; Stoeckli, Markus; Roempp, Andreas; Heeren, Ron M A; Spengler, Bernhard; Gala, Olivier; Haan, Serge

    2014-01-01

    Among the needs usually expressed by teams using mass spectrometry imaging, one that often arises is that for user-friendly software able to manage huge data volumes quickly and to provide efficient assistance for the interpretation of data. To answer this need, the Computis European project developed several complementary software tools to process mass spectrometry imaging data. Data Cube Explorer provides a simple spatial and spectral exploration for matrix-assisted laser desorption/ionisation-time of flight (MALDI-ToF) and time of flight-secondary-ion mass spectrometry (ToF-SIMS) data. SpectViewer offers visualisation functions, assistance to the interpretation of data, classification functionalities, peak list extraction to interrogate biological database and image overlay, and it can process data issued from MALDI-ToF, ToF-SIMS and desorption electrospray ionisation (DESI) equipment. EasyReg2D is able to register two images, in American Standard Code for Information Interchange (ASCII) format, issued from different technologies. The collaboration between the teams was hampered by the multiplicity of equipment and data formats, so the project also developed a common data format (imzML) to facilitate the exchange of experimental data and their interpretation by the different software tools. The BioMap platform for visualisation and exploration of MALDI-ToF and DESI images was adapted to parse imzML files, enabling its access to all project partners and, more globally, to a larger community of users. Considering the huge advantages brought by the imzML standard format, a specific editor (vBrowser) for imzML files and converters from proprietary formats to imzML were developed to enable the use of the imzML format by a broad scientific community. This initiative paves the way toward the development of a large panel of software tools able to process mass spectrometry imaging datasets in the future.
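
    imzML files can be read in Python with the independent pyimzml package (assumed installed via pip; it is a community parser, not one of the Computis deliverables), for example to build a total-ion-current image:

        # Read an imzML file with pyimzml and sum a TIC image (hypothetical file name).
        import numpy as np
        from pyimzml.ImzMLParser import ImzMLParser

        parser = ImzMLParser("example.imzML")
        tic_image = {}
        for idx, (x, y, z) in enumerate(parser.coordinates):
            mzs, intensities = parser.getspectrum(idx)
            tic_image[(x, y)] = float(np.sum(intensities))  # total ion current per pixel
        print(f"{len(tic_image)} pixels read")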

  13. The MORPH-R web server and software tool for predicting missing genes in biological pathways.

    PubMed

    Amar, David; Frades, Itziar; Diels, Tim; Zaltzman, David; Ghatan, Netanel; Hedley, Pete E; Alexandersson, Erik; Tzfadia, Oren; Shamir, Ron

    2015-09-01

    A biological pathway is the set of molecular entities involved in a given biological process and the interrelations among them. Even though biological pathways have been studied extensively, discovering missing genes in pathways remains a fundamental challenge. Here, we present an easy-to-use tool that allows users to run MORPH (MOdule-guided Ranking of candidate PatHway genes), an algorithm for revealing missing genes in biological pathways, and demonstrate its capabilities. MORPH supports the analysis in tomato, Arabidopsis and the two new species: rice and the newly sequenced potato genome. The new tool, called MORPH-R, is available both as a web server (at http://bioinformatics.psb.ugent.be/webtools/morph/) and as standalone software that can be used locally. In the standalone version, the user can apply the tool to new organisms using any proprietary and public data sources.

  14. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging

    PubMed Central

    Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data. PMID:27583365

  15. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    PubMed

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
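
    At its lowest order the SOFI principle is compact: the second-order autocumulant of a pixel's intensity trace is its temporal variance, which favours blinking emitters over steady background. A toy demonstration follows; the published tool adds cross-cumulants and higher orders.

        # Second-order SOFI in its simplest form: per-pixel temporal variance.
        import numpy as np

        rng = np.random.default_rng(6)
        frames, h, w = 500, 64, 64
        stack = rng.poisson(5.0, size=(frames, h, w)).astype(float)  # background
        blink = rng.random(frames) < 0.3                 # emitter on/off trace
        stack[blink, 30:33, 30:33] += 40.0               # one blinking emitter

        mean_img = stack.mean(axis=0)   # conventional (time-averaged) image
        sofi2 = stack.var(axis=0)       # 2nd-order SOFI image
        print(f"mean-image contrast: {mean_img[31, 31] / mean_img[10, 10]:.1f}x")
        print(f"SOFI-2 contrast:     {sofi2[31, 31] / sofi2[10, 10]:.1f}x")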

  16. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  17. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

Most geophysical systems are macroscopic physical systems whose behavior is predicted by means of computational models based on partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, highly parallelized supercomputers must be applied. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of present state-of-the-art techniques is the kind of discretizations they use. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (around 90% [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing (HPC), Parallel Computing, Domain Decomposition Methods (DDM). References: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", Numerical Methods for Partial Differential Equations, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (in press).
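
    To make the non-overlapping DDM idea concrete, here is a minimal, self-contained sketch of classical Schur-complement substructuring (a textbook method, not the DVS algorithm itself) for the 1D Poisson problem split into two subdomains sharing one interface node; the subdomain solves are independent and could run in parallel.

```python
import numpy as np

# -u'' = f on (0,1), u(0) = u(1) = 0, discretized with n interior points.
n = 99
h = 1.0 / (n + 1)
f = np.ones(n)
A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

m = n // 2                                   # the single interface node
I1, I2, G = np.arange(0, m), np.arange(m + 1, n), np.array([m])

A11, A22, Agg = A[np.ix_(I1, I1)], A[np.ix_(I2, I2)], A[np.ix_(G, G)]
A1g, Ag1 = A[np.ix_(I1, G)], A[np.ix_(G, I1)]
A2g, Ag2 = A[np.ix_(I2, G)], A[np.ix_(G, I2)]

# Schur complement on the interface; each subdomain solve is independent.
S = Agg - Ag1 @ np.linalg.solve(A11, A1g) - Ag2 @ np.linalg.solve(A22, A2g)
g = f[G] - Ag1 @ np.linalg.solve(A11, f[I1]) - Ag2 @ np.linalg.solve(A22, f[I2])
u_g = np.linalg.solve(S, g)

# Back-substitution in each subdomain (the parallelizable step).
u1 = np.linalg.solve(A11, f[I1] - A1g @ u_g)
u2 = np.linalg.solve(A22, f[I2] - A2g @ u_g)
```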

  18. Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

Presented are reviews of two computer software packages for Apple II computers: "Organic Spectroscopy" and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  19. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition, and it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). DTS application requires two reconstruction and two registration processes, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage produces two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from matching ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
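
    A minimal illustration of how a tomosynthesis plane can be formed from narrow-angle projections is the classic shift-and-add method sketched below. This is a simplified parallel-geometry stand-in with hypothetical parameters; the tool described here implements GPU-accelerated cone-beam reconstruction.

```python
import numpy as np
from scipy.ndimage import shift

def shift_and_add(projections, angles_deg, plane_depth, pixel_pitch=1.0):
    """Reconstruct one plane by the classic shift-and-add DTS method.

    projections : (N, H, W) array of kV projections over a narrow arc
    angles_deg  : acquisition angle of each projection, in degrees
    plane_depth : distance of the reconstruction plane from the reference plane
    """
    recon = np.zeros_like(projections[0], dtype=float)
    for proj, theta in zip(projections, np.deg2rad(angles_deg)):
        dx = plane_depth * np.tan(theta) / pixel_pitch   # parallax of this plane
        recon += shift(proj, (0.0, dx), order=1, mode="nearest")
    return recon / len(projections)                      # in-focus plane image
```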

  20. STRAP PTM: Software Tool for Rapid Annotation and Differential Comparison of Protein Post-Translational Modifications

    PubMed Central

    Spencer, Jean L.; Bhatia, Vivek N.; Whelan, Stephen A.; Costello, Catherine E.

    2014-01-01

    The identification of protein post-translational modifications (PTMs) is an increasingly important component of proteomics and biomarker discovery, but very few tools exist for performing fast and easy characterization of global PTM changes and differential comparison of PTMs across groups of data obtained from liquid chromatography-tandem mass spectrometry experiments. STRAP PTM (Software Tool for Rapid Annotation of Proteins: Post-Translational Modification edition) is a program that was developed to facilitate the characterization of PTMs using spectral counting and a novel scoring algorithm to accelerate the identification of differential PTMs from complex data sets. The software facilitates multi-sample comparison by collating, scoring, and ranking PTMs and by summarizing data visually. The freely available software (beta release) installs on a PC and processes data in protXML format obtained from files parsed through the Trans-Proteomic Pipeline. The easy-to-use interface allows examination of results at protein, peptide, and PTM levels, and the overall design offers tremendous flexibility that provides proteomics insight beyond simple assignment and counting. PMID:25422678
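
    To illustrate the spectral-counting idea in a few lines (this toy sketch is not STRAP PTM's scoring algorithm, and the input record layout is invented for illustration):

```python
import math
from collections import Counter

def ptm_spectral_counts(psms):
    """Tally spectral counts per (protein, PTM) from peptide-spectrum matches,
    each represented here as a dict like {'protein': 'P01857', 'ptms': [...]}."""
    counts = Counter()
    for psm in psms:
        for ptm in psm["ptms"]:
            counts[(psm["protein"], ptm)] += 1
    return counts

def rank_differential(counts_a, counts_b, pseudo=1):
    """Rank (protein, PTM) pairs by a pseudocount-stabilized log2
    spectral-count ratio between two samples, largest changes first."""
    keys = set(counts_a) | set(counts_b)
    ratios = {k: math.log2((counts_a[k] + pseudo) / (counts_b[k] + pseudo))
              for k in keys}
    return sorted(ratios.items(), key=lambda kv: abs(kv[1]), reverse=True)
```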

  1. Direct and adjoint sensitivity analysis of chemical kinetic systems with KPP: Part I—theory and software tools

    NASA Astrophysics Data System (ADS)

    Sandu, Adrian; Daescu, Dacian N.; Carmichael, Gregory R.

The analysis of comprehensive chemical reaction mechanisms, parameter estimation techniques, and variational chemical data assimilation applications requires the development of efficient sensitivity methods for chemical kinetics systems. The new release (KPP-1.2) of the kinetic preprocessor (KPP) contains software tools that facilitate direct and adjoint sensitivity analysis. The direct-decoupled method, built using BDF formulas, has been the method of choice for direct sensitivity studies. In this work, we extend the direct-decoupled approach to Rosenbrock stiff integration methods. The need for Jacobian derivatives prevented Rosenbrock methods from being used extensively in direct sensitivity calculations; however, new automatic and symbolic differentiation technologies make the computation of these derivatives feasible. The direct-decoupled method is known to be efficient for computing the sensitivities of a large number of output parameters with respect to a small number of input parameters. Adjoint modeling is presented as an efficient tool to evaluate the sensitivity of a scalar response function with respect to the initial conditions and model parameters. In addition, sensitivity with respect to time-dependent model parameters may be obtained through a single backward integration of the adjoint model. The KPP software may be used to completely generate the continuous and discrete adjoint models, taking full advantage of the sparsity of the chemical mechanism. Flexible direct-decoupled and adjoint sensitivity code implementations are achieved with minimal user intervention. In a companion paper, we present an extensive set of numerical experiments that validate the KPP software tools for several direct/adjoint sensitivity applications and demonstrate the efficiency of KPP-generated sensitivity code implementations.
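
    The adjoint idea, a single backward integration yielding the sensitivity of a scalar response, can be demonstrated on a one-species toy problem. The sketch below is purely illustrative and has nothing to do with KPP's generated code; the decay model and parameter values are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy model: y' = f(y, p) = -p*y with response J = y(T).
p, y0, T = 0.7, 1.0, 5.0

fwd = solve_ivp(lambda t, y: -p * y, (0, T), [y0],
                dense_output=True, rtol=1e-10, atol=1e-12)

# Adjoint equation: lam' = -(df/dy)^T lam = p*lam, backward from lam(T) = 1.
bwd = solve_ivp(lambda t, lam: p * lam, (T, 0), [1.0],
                dense_output=True, rtol=1e-10, atol=1e-12)

# dJ/dp = integral over [0, T] of lam(t) * (df/dp)(t), with df/dp = -y(t).
ts = np.linspace(0, T, 2001)
integrand = bwd.sol(ts)[0] * (-fwd.sol(ts)[0])
dJdp = np.trapz(integrand, ts)
print(dJdp, -T * y0 * np.exp(-p * T))   # numerical vs analytic sensitivity
```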

  2. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

To achieve its science objectives in deep space exploration, NASA needs science platform vehicles that autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical of current deep-space platforms. Hence the need for a study to benchmark advanced verification and validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In Sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In Section 4 the methodology for the experiment is described; this

  3. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data have the potential to address a number of research areas, including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The mid-water tools developed in this project address this problem by storing the water-column data in a unified, generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow the tools have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet-type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize the throughput
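
    As an illustration of the per-file packet indexing described above, here is a minimal sketch for a hypothetical GWC-like packet layout; the 8-byte header is an assumption made for this example, and the real format differs.

```python
import struct
from collections import defaultdict

def index_packets(path):
    """Build a packet-type -> file-offset index for a hypothetical format
    whose packets start with an 8-byte little-endian header:
    uint32 packet_type, uint32 payload_size."""
    index = defaultdict(list)
    with open(path, "rb") as fh:
        while True:
            offset = fh.tell()
            header = fh.read(8)
            if len(header) < 8:
                break
            ptype, size = struct.unpack("<II", header)
            index[ptype].append(offset)      # remember where this packet starts
            fh.seek(size, 1)                 # skip payload without reading it
    return index                             # enables fast non-linear lookup
```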

  4. The SRS-Viewer: A Software Tool for Displaying and Evaluation of Pyroshock Data

    NASA Astrophysics Data System (ADS)

    Eberl, Stefan

    2014-06-01

For evaluating the success of a pyroshock test, the time domain and the corresponding shock response spectra (SRS) have to be considered. The SRS-Viewer is an IABG-developed software tool [1] that reads data in Universal File format (*.unv) and, for each accelerometer, either displays or plots the time domain, the corresponding SRS, and the specified reference SRS with tolerances in the background. The software calculates the average (AVG), maximum (MAX), and minimum (MIN) SRS of any selection of accelerometers. A statistical analysis calculates the percentage of measured SRS above the specified reference SRS level and the percentage within the tolerance bands, for comparison with the specified success criteria. Overlay plots of single accelerometers from different test runs make it possible to monitor the repeatability of the shock input and the integrity of the specimen. Furthermore, the difference between the shock on a mass dummy and on the real test unit can be examined.
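
    For orientation, an SRS can be computed by simulating, for each natural frequency, the absolute-acceleration response of a damped single-degree-of-freedom oscillator to the measured base acceleration and retaining the peak. The sketch below is our own generic illustration with SciPy, not the IABG implementation; the damping ratio and frequency grid are assumed values.

```python
import numpy as np
from scipy import signal

def srs(accel, dt, freqs, zeta=0.05):
    """Maximax shock response spectrum of a base-acceleration time history.

    For each natural frequency fn, simulate the absolute-acceleration
    response of a damped SDOF oscillator and keep the peak magnitude.
    """
    t = np.arange(len(accel)) * dt
    out = []
    for fn in freqs:
        wn = 2 * np.pi * fn
        # SDOF absolute-acceleration transfer function for base excitation
        sys = signal.TransferFunction([2 * zeta * wn, wn**2],
                                      [1, 2 * zeta * wn, wn**2])
        _, resp, _ = signal.lsim(sys, accel, t)
        out.append(np.max(np.abs(resp)))
    return np.asarray(out)

freqs = np.logspace(1, 4, 60)    # 10 Hz .. 10 kHz, log-spaced, Q = 10
```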

  5. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    PubMed Central

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    Motivation: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. Results: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype likelihoods instead of genotypes and thereby takes the inherent uncertainty of the genotypes into account. Using both simulated and real data, we show that NgsRelate provides markedly better estimates for low-depth NGS data than two state-of-the-art genotype-based methods. Availability: NgsRelate is implemented in C++ and is available under the GNU license at www.popgen.dk/software. Contact: ida@binf.ku.dk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26323718

  6. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated computer-aided design (CAD) tools for synthetic biology ( www.eugenecad.org ).
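
    The flavor of constraint-based design enumeration can be conveyed in a few lines of Python. This is a generic illustration using invented part names and does not reproduce miniEugene's rule syntax or constraint set.

```python
from itertools import permutations

# Toy part inventory (hypothetical names) and positional constraints.
parts = ["pTet", "pLac", "rbs1", "lacI", "tetR", "term"]

def satisfies(design):
    return (design.index("pTet") < design.index("lacI")   # promoter before CDS
            and design.index("pLac") < design.index("tetR")
            and design[-1] == "term")                     # terminator ends design

# Enumerate every ordering of the parts that meets all constraints.
designs = [d for d in permutations(parts) if satisfies(d)]
print(len(designs), designs[0])
```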

  7. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    PubMed

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, participants perceived the software and its embedded ergonomics tools as benefiting their design work, increasing their empathy for and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding among engineering postgraduate students regarding applied ergonomics in naval architecture.

  8. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    PubMed

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
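
    The core measurement, the fraction of image area not occupied by cells, can be approximated crudely by thresholding local image texture, as in the sketch below. TScratch itself uses the fast discrete curvelet transform rather than this simplification; the window size and threshold here are arbitrary illustration values.

```python
import numpy as np
from scipy import ndimage

def open_area_fraction(img, size=15, thresh=None):
    """Estimate the cell-free (wound) area fraction of a monolayer image by
    thresholding local intensity standard deviation: cell-covered regions
    are textured, the open wound is comparatively flat."""
    img = img.astype(float)
    local_mean = ndimage.uniform_filter(img, size=size)
    local_sq = ndimage.uniform_filter(img ** 2, size=size)
    texture = np.sqrt(np.maximum(local_sq - local_mean ** 2, 0.0))
    if thresh is None:
        thresh = 0.5 * texture.mean()          # crude automatic threshold
    return float((texture < thresh).mean())    # fraction of low-texture pixels
```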

  9. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    PubMed

    Papageorgiou, Elpiniki I

    2012-03-01

Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals whose urinary tracts are structurally and functionally normal. Recommending appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention. Reducing the unsafe use and overall consumption of antibiotics is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines and treatment suggestions and for reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was evaluated on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results show that the suggested FCM-uUTI tool provides a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Owing to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems.
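
    The underlying FCM inference is compact: concept activations are repeatedly propagated through a signed weight matrix and squashed by a sigmoid. The sketch below shows one common formulation; the weights and concepts in a real system would come from the encoded medical knowledge, none of which is reproduced here.

```python
import numpy as np

def fcm_infer(W, x0, steps=20, lam=1.0):
    """Iterate a fuzzy cognitive map: x_{t+1} = sigmoid(W @ x_t).

    W  : concept-to-concept weight matrix with entries in [-1, 1]
    x0 : initial activations (e.g. symptoms, patient factors, treatments)
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-lam * z))
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = sigmoid(W @ x)        # propagate causal influences, then squash
    return x                      # near-steady activations; rank treatment concepts
```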

  10. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, Sean; Hughes, Gary

    2008-07-31

Mission Research develops analyst tools and automated quality control software to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with its data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics and diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software that runs in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that, when checked, will improve the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting and refinement, and for use by other processes. Web-based applications to view the results are also available.
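
    One simple automated outlier check of the kind described, flagging samples that deviate strongly from a rolling median, might look like the generic sketch below (our illustration, not the Data Quality Office's actual code; window and threshold are arbitrary).

```python
import numpy as np

def flag_outliers(x, window=100, k=5.0):
    """Flag samples deviating more than roughly k sigma from a rolling median,
    using the median absolute deviation (MAD) as a robust scale estimate."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window)
        med = np.median(x[lo:hi])
        mad = np.median(np.abs(x[lo:hi] - med)) + 1e-12
        flags[i] = abs(x[i] - med) > k * 1.4826 * mad   # MAD -> sigma estimate
    return flags
```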

  11. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  12. DASAO: software tool for the management of safeguards, waste and decommissioning

    SciTech Connect

    Noynaert, Luc; Verwaest, Isi; Libon, Henri; Cuchet, Jean-Marie

    2013-07-01

Decommissioning of nuclear facilities is a complex process involving operations such as detailed surveys, decontamination and dismantling of equipment, demolition of buildings, and management of the resulting waste and nuclear materials, if any. This process takes place within a well-developed legal framework and is controlled and followed up by stakeholders such as the safety authority, the radwaste management agency and the safeguards organization. In the framework of its nuclear waste and decommissioning program, and more specifically the decommissioning of the BR3 reactor, SCK-CEN has developed different software tools to secure waste and material traceability, to support the sound management of the decommissioning project, and to facilitate control and follow-up by the stakeholders. In the case of Belgium, these are the Federal Agency for Nuclear Control, the national agency for radioactive waste management and fissile material, and EURATOM and the IAEA. In 2005, Belgonucleaire decided to shut down its Dessel MOX fuel fabrication plant, and production stopped in 2006. According to the final decommissioning plan ('PDF') approved by NIRAS, the decommissioning works were to start in 2008 at the earliest. In 2006, the management of Belgonucleaire identified the need for an integrated database and decided to entrust SCK-CEN with its development, because SCK-CEN could rely on previous experience with comparable applications already approved by authorities such as NIRAS, FANC and EURATOM. The main objectives of this integrated software tool are: simplified and updated safeguards; waste and material traceability; computerized documentation; support to project management; and periodic and final reporting to waste and safety authorities. The software, called DASAO (Database for Safeguards, Waste and Decommissioning), was successfully commissioned in 2008 and extensively used from 2009 onward, to the satisfaction of Belgonucleaire and the stakeholders. SCK-CEN is now implementing

  13. A software tool for removing patient identifying information from clinical documents.

    PubMed

    Friedlin, F Jeff; McDonald, Clement J

    2008-01-01

We created a software tool that accurately removes all patient-identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed two evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluation, we performed a second evaluation using 7,190 pathology report HL7 messages. We compared the results of the MeDS de-identification process with a gold standard of human review to find identifying strings. For both evaluations, we calculated the number of successful scrubs, missed identifiers, and over-scrubs committed by MeDS, and evaluated the readability and interpretability of the scrubbed messages. We categorized all missed identifiers into three groups: (1) complete HIPAA-specified identifiers, (2) HIPAA-specified identifier fragments, and (3) non-HIPAA-specified identifiers (such as provider names and addresses). In the first evaluation, MeDS scrubbed 11,273 (99.06%) of the 11,380 HIPAA-specified identifiers and 38,095 (98.26%) of the 38,768 non-HIPAA-specified identifiers. In our second evaluation (after modification of the software), MeDS scrubbed 79,993 (99.47%) of the 80,418 HIPAA-specified identifiers and 12,689 (96.93%) of the 13,091 non-HIPAA-specified identifiers. Approximately 95% of scrubbed messages were both readable and interpretable. We conclude that MeDS successfully de-identified a wide range of medical documents from numerous sources and creates scrubbed reports that retain their interpretability, thereby maintaining their usefulness for research.
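
    A drastically simplified scrubber conveys the idea of pattern-based de-identification; the patterns below are illustrative only and fall far short of MeDS's coverage of HIPAA-specified identifiers.

```python
import re

# Toy identifier patterns and their replacement tokens (illustrative only).
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b(?:MRN|MR#)\s*:?\s*\d+\b", re.I), "[MRN]"),
]

def scrub(text):
    """Replace every matched identifier with a category token so the
    scrubbed report stays readable and interpretable."""
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text

print(scrub("Pt MRN: 483920 seen 3/14/2007, callback 317-555-0142."))
```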

  14. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through a full-scale application example. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  15. A software tool to assist business-process decision-making in the biopharmaceutical industry.

    PubMed

    Mustafa, Mustafa A; Washbrook, John; Lim, Ai Chye; Zhou, Yuhong; Titchener-Hooker, Nigel J; Morton, Philip; Berezenko, Steve; Farid, Suzanne S

    2004-01-01

Conventionally, software tools for the design of bioprocesses have provided only limited business-related information for decision-making. There is an industrial need to investigate manufacturing options and to gauge the impact of various decisions from economic as well as process perspectives. This paper describes the development and use of a tool that assesses whole flowsheets by capturing both process and business aspects. The tool is demonstrated by considering the issues involved in deciding between two potential flowsheets for a common product. A case-study approach is used to compare the process and business benefits of a conventional process route employing packed chromatography beds with an alternative that uses expanded bed adsorption (EBA). The tool allows direct evaluation of the benefits of the capital cost reduction and increased yield offered by EBA against the penalties of using a potentially more expensive EBA matrix with a shorter lifetime. Furthermore, the tool provides the ability to gauge the process robustness of each flowsheet option.

  16. MS Data Miner: a web-based software tool to analyze, compare, and share mass spectrometry protein identifications.

    PubMed

    Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J

    2012-09-01

Processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectrum validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM with a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides.

  17. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    ERIC Educational Resources Information Center

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  18. Verification of visual odometry algorithms with an OpenGL-based software tool

    NASA Astrophysics Data System (ADS)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multi-camera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.
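
    The core of such a generator is projecting a static 3D scene into a stereo pair for a given camera pose. Below is a minimal pinhole-model sketch; the intrinsics and baseline are assumed values, and the real tool renders full images and segmentation via OpenGL rather than projecting sparse points.

```python
import numpy as np

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # assumed intrinsics
baseline = 0.1                                               # stereo baseline [m]

def project(points_w, R, t, tx=0.0):
    """Project world points into the left (tx=0) or right (tx=baseline) camera
    of a rectified stereo rig with pose (R, t)."""
    pc = (R @ points_w.T + t.reshape(3, 1)) - np.array([[tx], [0.0], [0.0]])
    uv = K @ pc
    return (uv[:2] / uv[2]).T, pc[2]          # pixel coordinates and depths

# Synthetic static scene in front of the rig, identity pose.
pts = np.random.default_rng(0).uniform(-2, 2, (100, 3)) + [0, 0, 5]
R, t = np.eye(3), np.zeros(3)
uv_left, depth = project(pts, R, t)
uv_right, _ = project(pts, R, t, tx=baseline)
disparity = uv_left[:, 0] - uv_right[:, 0]    # ground truth: f * B / Z
```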

  19. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

This paper presents browser-based, multimedia-rich software tools and an e-learning curriculum that support the design and modeling of power electronics circuits and explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in electrical engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing an e-learning curriculum and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.

  20. Modification, Implementation, and Evaluation of a Remote Terminal Emulator as a Software Validation and Stress Testing Tool.

    DTIC Science & Technology

    1987-12-01

The two tools are also compared during the emulation phase of software validation. The RTE package was also examined as a stress testing tool. The remote terminal emulator is the continuation of an effort in software and hardware configuration management which began at the Military Airlift Command in

  1. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    SciTech Connect

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-07

Research highlights: (1) The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. (2) Off-target silencing scores below 30 were observed for the best siRNA constructs for different genes. (3) Immunostimulation and cytotoxicity motifs are considered and penalized in the developed tool. (4) Both positional and compositional criteria were observed to be important. -- Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, and immunostimulatory motifs, were emphasized and implemented in the siRNA design tool. The tool provides a thermodynamic stability score, GC content, and a total score based on the other design criteria in its output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic scores and positional properties. Comparable thermodynamic scores and better total scores were observed relative to existing tools. Moreover, the results generated had comparable off-target silencing effects. Evaluation of additional criteria was carried out in WEKA.
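
    A toy scorer, loosely inspired by Reynolds-style positional rules, illustrates how such criteria combine into a total score. The weights, rules, and example sequence below are invented for illustration and do not reproduce the published tool.

```python
def gc_content(seq):
    """Percent G+C of an RNA sequence string."""
    return 100.0 * sum(b in "GC" for b in seq) / len(seq)

def score_candidate(seq):
    """Toy total score for a 19-nt sense strand (5'->3')."""
    s = 0
    if 30 <= gc_content(seq) <= 52:
        s += 1                                  # moderate GC content
    s += sum(b in "AU" for b in seq[14:19])     # A/U-rich 3' end (pos. 15-19)
    if seq[2] == "A":
        s += 1                                  # A at position 3
    if seq[9] == "U":
        s += 1                                  # U at position 10
    if seq[18] == "A":
        s += 1                                  # A at position 19
    if seq[18] == "G":
        s -= 1                                  # penalize G at position 19
    if "GGGG" in seq or "CCCC" in seq:
        s -= 1                                  # internal repeats
    return s

print(score_candidate("GCAAGCUGACCCUGAAGUU"))   # hypothetical candidate
```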

  2. Loader Lite: a new software tool for the ABI PRISM 3700 DNA sequencer.

    PubMed

    Scott, G B I; Steffen, D L; Edgar, D; Warren, J T; Kovár, C L; Scherer, S E; Havlak, P H; Gibbs, R A

    2002-06-01

Here we describe the development of a novel software tool entitled Loader Lite that generates plate records, or sample sheets, for the ABI PRISM 3700 DNA sequencer. The major advantage of this program is that it enables the ongoing operation of sequencing instruments without reference to external networks. The autonomous operation of sequencing instruments is critical if sample throughput is to be maintained during periods of network outage. Loader Lite employs a deliberate strategy of inputting anonymous tray barcodes at run time. After sequencing, the barcodes are reconciled with the relevant project details by reference to a database. This software takes advantage of barcode scanning technology by creating plate records directly on the local computer serving an individual sequencer, immediately before importing and linking. This real-time synthesis of the plate records at the point of loading all but eliminates loading errors. Loader Lite is user-friendly, fully configurable, and permits the running of partial or full 384-well sample trays using any standard combinations of run modules, dye sets, mobility files, analysis modules, etc. The 96-well format is not supported; however, this capability will appear in subsequent versions that are currently under development. This application is designed as an added-value, adjunct program to the regular ABI PRISM 3700 Data Collection software. We have successfully used Loader Lite over the past six months to load approximately 7 million sequencing reactions and believe its utility and functionality will prove attractive to the wider sequencing community.

  3. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high-performance computer simulations to satisfy their demands for large computational resources and short response times. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high-performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly the implementation of numerical algorithms and support for code development, execution and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicated effort, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  4. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareani, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment in which three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars rover. What makes this study unique is that it is the first (to the best of our knowledge) to run a controlled experiment comparing formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the rover.

  5. DSSR: an integrated software tool for dissecting the spatial structure of RNA.

    PubMed

    Lu, Xiang-Jun; Bussemaker, Harmen J; Olson, Wilma K

    2015-12-02

    Insight into the three-dimensional architecture of RNA is essential for understanding its cellular functions. However, even the classic transfer RNA structure contains features that are overlooked by existing bioinformatics tools. Here we present DSSR (Dissecting the Spatial Structure of RNA), an integrated and automated tool for analyzing and annotating RNA tertiary structures. The software identifies canonical and noncanonical base pairs, including those with modified nucleotides, in any tautomeric or protonation state. DSSR detects higher-order coplanar base associations, termed multiplets. It finds arrays of stacked pairs, classifies them by base-pair identity and backbone connectivity, and distinguishes a stem of covalently connected canonical pairs from a helix of stacked pairs of arbitrary type/linkage. DSSR identifies coaxial stacking of multiple stems within a single helix and lists isolated canonical pairs that lie outside of a stem. The program characterizes 'closed' loops of various types (hairpin, bulge, internal, and junction loops) and pseudoknots of arbitrary complexity. Notably, DSSR employs isolated pairs and the ends of stems, whether pseudoknotted or not, to define junction loops. This new, inclusive definition provides a novel perspective on the spatial organization of RNA. Tests on all nucleic acid structures in the Protein Data Bank confirm the efficiency and robustness of the software, and applications to representative RNA molecules illustrate its unique features. DSSR and related materials are freely available at http://x3dna.org/.

  6. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    SciTech Connect

    Schachter, J.; Peng, Q.; Schissel, D.P.

    1999-07-01

Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server.
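
    For a collaborator, retrieving analyzed data from an MDSplus server takes only a few lines with the MDSplus Python bindings. The server address, tree name, shot number, and node path below are placeholders, not real DIII-D values.

```python
from MDSplus import Connection

conn = Connection("mds-server.example.org")    # hypothetical data server
conn.openTree("results", 98765)                # hypothetical tree and shot number

sig = conn.get(r"\results::top.density")       # hypothetical node path
y = sig.data()                                 # signal values as a numpy array
t = conn.get(r"dim_of(\results::top.density)").data()   # its time base
```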

  7. DSSR: an integrated software tool for dissecting the spatial structure of RNA

    PubMed Central

    Lu, Xiang-Jun; Bussemaker, Harmen J.; Olson, Wilma K.

    2015-01-01

    Insight into the three-dimensional architecture of RNA is essential for understanding its cellular functions. However, even the classic transfer RNA structure contains features that are overlooked by existing bioinformatics tools. Here we present DSSR (Dissecting the Spatial Structure of RNA), an integrated and automated tool for analyzing and annotating RNA tertiary structures. The software identifies canonical and noncanonical base pairs, including those with modified nucleotides, in any tautomeric or protonation state. DSSR detects higher-order coplanar base associations, termed multiplets. It finds arrays of stacked pairs, classifies them by base-pair identity and backbone connectivity, and distinguishes a stem of covalently connected canonical pairs from a helix of stacked pairs of arbitrary type/linkage. DSSR identifies coaxial stacking of multiple stems within a single helix and lists isolated canonical pairs that lie outside of a stem. The program characterizes ‘closed’ loops of various types (hairpin, bulge, internal, and junction loops) and pseudoknots of arbitrary complexity. Notably, DSSR employs isolated pairs and the ends of stems, whether pseudoknotted or not, to define junction loops. This new, inclusive definition provides a novel perspective on the spatial organization of RNA. Tests on all nucleic acid structures in the Protein Data Bank confirm the efficiency and robustness of the software, and applications to representative RNA molecules illustrate its unique features. DSSR and related materials are freely available at http://x3dna.org/. PMID:26184874

  8. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    SciTech Connect

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  9. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    NASA Technical Reports Server (NTRS)

    Premo, D. A.

    1994-01-01

The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government sponsored industry and university research effort. APT is a "problem-oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user-supplied post-processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature, which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro-variable cutter move statement combined with macro-variable double check surfaces. Earlier modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user-friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  10. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low-temperature tooling applications.

  11. Establishing a Web-based DICOM teaching file authoring tool using open-source public software.

    PubMed

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-09-01

Online teaching files are an important source of educational and reference materials in the radiology community. The Digital Imaging and Communications in Medicine (DICOM) file format commonly used in the radiology community is not natively supported by common Web browsers, so the ability of the Web server to convert and parse DICOM is important when dedicated DICOM-converting tools are not available. In this paper, we describe our approach to developing a Web-based teaching file authoring tool. Our server is built using the Apache Web server running on the FreeBSD operating system. The dynamic page content is produced by the Hypertext Preprocessor (PHP). DICOM images are converted by ImageMagick into Joint Photographic Experts Group (JPEG) format. DICOM attributes are parsed by dicom3tools and stored in a PostgreSQL database. Using free software available from the Internet, we built a Web service that allows radiologists to create their own online teaching file cases with a common Web browser.
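
    The paper's conversion step uses PHP with ImageMagick and dicom3tools; an equivalent step in Python, assuming the pydicom and Pillow packages and a hypothetical file name, looks like this:

```python
import numpy as np
import pydicom
from PIL import Image

ds = pydicom.dcmread("study/image001.dcm")              # hypothetical file name
arr = ds.pixel_array.astype(float)

# Rescale the pixel data into an 8-bit range suitable for JPEG display.
arr = 255.0 * (arr - arr.min()) / max(np.ptp(arr), 1.0)
Image.fromarray(arr.astype(np.uint8)).save("image001.jpg", quality=90)

# Attributes of the kind a teaching-file database would store.
print(ds.PatientID, ds.Modality, ds.StudyDate)
```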

  12. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

The omnipresent need for optimisation requires constant improvement of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and “what-if” scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694

  13. MINFIT: A Spreadsheet-Based Tool for Parameter Estimation in an Equilibrium Speciation Software Program.

    PubMed

    Xie, Xiongfei; Giammar, Daniel E; Wang, Zimeng

    2016-10-07

Determination of equilibrium constants describing chemical reactions in the aqueous phase and at the solid-water interface relies on inverse modeling and parameter estimation. Although existing tools are available, the steep learning curve prevents the wider community of environmental engineers and chemists from adopting those tools. Stemming from classical chemical equilibrium codes, MINEQL+ has been one of the most widely used chemical equilibrium software programs. We developed a spreadsheet-based tool, which we are calling MINFIT, that interacts with MINEQL+ to perform parameter estimations that optimize model fits to experimental data sets. MINFIT enables automatic and convenient screening of a large number of parameter sets toward the optimal solutions by calling MINEQL+ to perform iterative forward calculations following either exhaustive equidistant grid search or randomized search algorithms. The combined use of the two algorithms can reliably guide the searches toward the global optima. We developed interactive interfaces so that the optimization processes are transparent. Benchmark examples including both aqueous and surface complexation problems illustrate the parameter estimation and the associated sensitivity analysis. MINFIT is accessible at http://minfit.strikingly.com.
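
    The two search strategies can be sketched generically as follows, with a stand-in forward model in place of a MINEQL+ run; the model, parameter range, and data below are invented for illustration only.

```python
import numpy as np

def forward_model(logK, x):
    """Hypothetical equilibrium-like response standing in for a MINEQL+ run."""
    return x / (x + 10.0 ** (-logK))

def sse(logK, x_obs, y_obs):
    """Sum of squared errors between model and data for one parameter value."""
    return float(np.sum((forward_model(logK, x_obs) - y_obs) ** 2))

# Synthetic "experimental" data with a known optimum near logK = 5.
x_obs = np.logspace(-8, -3, 12)
y_obs = forward_model(5.0, x_obs) + 0.02 * np.random.default_rng(1).standard_normal(12)

# 1) Exhaustive equidistant grid search over the parameter range.
grid = np.linspace(2.0, 8.0, 121)
best = min(grid, key=lambda k: sse(k, x_obs, y_obs))

# 2) Randomized search around the grid optimum to refine it.
rng = np.random.default_rng(2)
for cand in best + 0.1 * rng.standard_normal(500):
    if sse(cand, x_obs, y_obs) < sse(best, x_obs, y_obs):
        best = cand
print(best)
```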

  14. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually achieved by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging, and various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extensible, allowing new criteria to be added to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. A toy sketch of such a hierarchy appears below.
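
    To make the idea of a DEX-style hierarchy concrete, here is a deliberately tiny sketch: ordinal leaf criteria are aggregated bottom-up through monotone rule tables. The criteria names, scale, and the "worse of the two" rule are illustrative only, not the published model.

```python
# Toy sketch of a DEX-style qualitative hierarchy: leaf criteria take ordinal
# values and monotone rule tables aggregate them bottom-up. Criteria names,
# scale and rules are illustrative, not the published model.
SCALE = ["poor", "acceptable", "good"]

def worse_of(a: str, b: str) -> str:
    # A minimal monotone aggregation rule: the aggregate equals the worse input.
    return SCALE[min(SCALE.index(a), SCALE.index(b))]

def rank_tool(visual: str, simulation: str, statistics: str, reporting: str) -> str:
    usability = worse_of(visual, reporting)        # intermediate criterion 1
    analytics = worse_of(simulation, statistics)   # intermediate criterion 2
    return worse_of(usability, analytics)          # root: overall suitability

print(rank_tool("good", "good", "acceptable", "good"))  # -> acceptable
```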

  15. Software for Information Storage and Retrieval Tested, Evaluated and Compared: Part VI--Various Additional Programs.

    ERIC Educational Resources Information Center

    Sieverts, Eric G.; And Others

    1993-01-01

    Reports on tests evaluating nine microcomputer software packages designed for information storage and retrieval: BRS-Search, dtSearch, InfoBank, Micro-OPC, Q&A, STN-PFS, Strix, TINman, and ZYindex. Tables and narrative evaluations detail results related to security, hardware, user features, search capability, indexing, input, maintenance of files,…

  16. A Software Tool for Processing the Displacement Time Series Extracted from Raw Radar Data

    NASA Astrophysics Data System (ADS)

    Coppi, Francesco; Gentile, Carmelo; Paolo Ricci, Pier

    2010-05-01

    The application of high-resolution radar waveforms and interferometric principles recently led to the development of a microwave interferometer suitable for simultaneously measuring the (static or dynamic) deflection of several points on a large structure. From the technical standpoint, the sensor is a Stepped Frequency Continuous Wave (SF-CW) coherent radar operating in the Ku frequency band. In the paper, the main procedures adopted to extract the deflection time series from raw radar data and to assess the quality of the data are addressed, and the MATLAB toolbox developed is described. Subsequently, other functions implemented in the software tool (e.g. evaluation of the spectral matrix of the deflection time histories, identification of natural frequencies, and evaluation of operational mode shapes) are described, and their application to data recorded on full-scale bridges is exemplified.
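
    A hedged sketch of the spectral post-processing stage follows: estimate the power spectrum of a deflection record and take the strongest peaks as natural-frequency candidates. The toolbox itself is MATLAB; this NumPy version with a synthetic two-mode signal only illustrates the idea.

```python
# Hedged sketch of the post-processing stage: pick natural-frequency
# candidates as the strongest peaks of a deflection record's power spectrum.
import numpy as np

def natural_frequency_candidates(signal, fs, n_peaks=3):
    """Return the n_peaks strongest frequencies in a deflection record."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(spectrum[1:])[::-1] + 1   # skip the DC bin
    return freqs[order[:n_peaks]]

fs = 100.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1.0 / fs)    # a synthetic two-mode bridge response
x = np.sin(2 * np.pi * 2.4 * t) + 0.3 * np.sin(2 * np.pi * 7.1 * t)
print(natural_frequency_candidates(x, fs, n_peaks=2))   # ~[2.4, 7.1] Hz
```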

  17. LEVER: software tools for segmentation, tracking and lineaging of proliferating cells.

    PubMed

    Winter, Mark; Mankowski, Walter; Wait, Eric; Temple, Sally; Cohen, Andrew R

    2016-11-15

    The analysis of time-lapse images showing cells dividing to produce clones of related cells is an important application in biological microscopy. Imaging at the temporal resolution required to establish accurate tracking for vertebrate stem or cancer cells often requires the use of transmitted light or phase-contrast microscopy. Processing these images requires automated segmentation, tracking and lineaging algorithms. There is also a need for any errors in the automated processing to be easily identified and quickly corrected. We have developed LEVER, an open source software tool that combines the automated image analysis for phase-contrast microscopy movies with an easy-to-use interface for validating the results and correcting any errors.

  18. BioBrick assembly standards and techniques and associated software tools.

    PubMed

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards are described, as well as some of the most widely used assembly techniques and cloning procedures, together with a presentation of the available software tools that help decide on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database.
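
    Compatibility checks of this kind reduce to simple sequence screens. Below is an illustrative sketch in the spirit of RFC 10-style standards: a part is incompatible if it contains internal restriction sites reserved for assembly. The four enzyme sites listed are the classic BioBrick ones, but the exact rule set of any given standard should be taken from the standard itself.

```python
# Illustrative compatibility screen: a part must not contain internal copies
# of the restriction sites used by the assembly standard. Verify the rule set
# against the standard's own documentation.
FORBIDDEN_SITES = {
    "EcoRI": "GAATTC",
    "XbaI": "TCTAGA",
    "SpeI": "ACTAGT",
    "PstI": "CTGCAG",
}

def incompatible_sites(part_sequence: str) -> list[str]:
    seq = part_sequence.upper()
    return [name for name, site in FORBIDDEN_SITES.items() if site in seq]

part = "ATGGCTAGCAAAGGAGAATTCACC"     # toy part containing an EcoRI site
hits = incompatible_sites(part)
print("incompatible with RFC 10:" if hits else "no forbidden sites found", hits)
```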

  19. A browsing tool for the Internet Logical Library of the HPCC Software Exchange

    NASA Technical Reports Server (NTRS)

    Biro, Ross

    1993-01-01

    As the quantity of information available on the Internet grows, locating a particular piece of information becomes more difficult. One possible solution is for a database of pointers to all available information to be maintained at a central site, along with subject classifications for all the information in order to make searching possible. This paper describes one possible method of searching such an index. In particular, a prototype browsing tool has been created using Tcl/Tk to demonstrate several possible features: rapidly scanning at any rank of the index, narrowing the index to any scope, regular-expression searching, and creation of a list of pointers corresponding to any set of index terms. The prototype browser is an easy-to-use, independent X application designed for use in the Catalog of Repositories of the HPCC (High Performance Computing and Communications) Software Exchange.

  20. A protocol building software tool for medical device quality control tests.

    PubMed

    Theodorakos, Y; Gueorguieva, K; Bliznakov, J; Kolitsi, Z; Pallikarakis, N

    1999-01-01

    Q-Pro is an application for quality control and inspection of medical devices. General system requirements include a friendly and comprehensive graphical environment and a proper, quick, easy and intuitive user interface. The system provides functions such as a tool library for protocol design, widely used multimedia, and support for a local database for protocol and inventory data archiving. In order to serve the different categories of users involved in quality control procedures, the system has been split into three modules of different functionality and complexity, each of which can work as a stand-alone application. The implementation of protocols and the use of the software functions, as well as the user interface itself, were judged by the evaluators to be clear and intuitive. The software adapts easily to different kinds of quality control procedures and objectives. Q-Pro effectively supports and enhances the processes needed to attain highly tuned, professional, responsive and effective quality control and preventive maintenance procedures for biomedical equipment management.

  1. Introduction of software tools for epidemiological surveillance in infection control in Colombia

    PubMed Central

    Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Introduction: Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. Manual methods of epidemiological surveillance (ES) may reduce the sensitivity and specificity of the monitoring system, whereas electronic surveillance can improve the performance, quality and traceability of recorded information. Objective: To assess the implementation of a strategy for electronic surveillance of HAI, bacterial resistance and antimicrobial consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia during the period 2012-2013. Methods: An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the HAI Solutions software, and adherence to recording the required information. Results: Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard CDC criteria, and 87% of institutions adopted the case-identification module of the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. Conclusions: The introduction of ES could achieve greater adherence to an active, standardized and prospective surveillance model, helping to improve the validity and quality of the recorded information. PMID:26309340

  2. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    NASA Astrophysics Data System (ADS)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

    Sounding rockets are the only way to perform in-situ measurements, which are very important experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT). The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket (typically about 1000 m/s). This shock wave disturbs the density, temperature and velocity fields in the vicinity of the rocket relative to the undisturbed values of the atmosphere. This effect, however, can be quantified, and the measured data have to be corrected not just to make them more precise but simply to make them usable. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G. A. Bird, which is available as a stand-alone program limited to a single processor. Apart from the complications of simulating flows around bodies in the different flow regimes of the MLT altitude range, which arise from the exponential density change of several orders of magnitude, the particular hardware configuration introduces a significant difficulty for aerodynamical calculations: the choice of grid sizes depends both on the demands of an adequate DSMC and on good resolution of geometries whose scales differ by several orders of magnitude. This makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open-source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around sounding rocket instrumentation. An advantage of this software package, among other things, is that it can run on high-performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  3. FAMIAS - A userfriendly new software tool for the mode identification of photometric and spectroscopic time series

    NASA Astrophysics Data System (ADS)

    Zima, W.

    2008-12-01

    FAMIAS (Frequency Analysis and Mode Identification for AsteroSeismology) is a collection of state-of-the-art software tools for the analysis of photometric and spectroscopic time series data. It is one of the deliverables of the Work Package NA5: Asteroseismology of the European Coordination Action in Helio- and Asteroseismology (HELAS). Two main sets of tools are incorporated in FAMIAS. The first set allows searching for periodicities in the data using Fourier and non-linear least-squares fitting algorithms. The other set allows carrying out a mode identification for the detected pulsation frequencies to determine their pulsational quantum numbers, the harmonic degree, ℓ, and the azimuthal order, m. For the spectroscopic mode identification, the Fourier parameter fit method and the moment method are available. The photometric mode identification is based on pre-computed grids of atmospheric parameters and non-adiabatic observables, and uses the method of amplitude ratios and phase differences in different filters. The types of stars to which FAMIAS is applicable are main-sequence pulsators hotter than the Sun. This includes the Gamma Dor stars, Delta Sct stars, the slowly pulsating B stars and the Beta Cep stars - basically all pulsating main-sequence stars for which empirical mode identification is required to successfully carry out asteroseismology. The complete manual for FAMIAS is published in a special issue of Communications in Asteroseismology, Vol. 155. The FAMIAS homepage provides the possibility to download the software and to read the on-line documentation.
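
    The non-linear least-squares stage can be illustrated compactly. The following is a hedged sketch, not FAMIAS code: it fits a single sinusoid to an irregularly sampled synthetic light curve with SciPy; the frequency, amplitude and noise level are invented, and the initial guess is assumed to come from a preceding Fourier scan.

```python
# Hedged sketch of the non-linear least-squares stage: fit one sinusoid to an
# irregularly sampled synthetic light curve. Not FAMIAS code; all values are
# invented, and p0 is assumed to come from an initial Fourier scan.
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, freq, amp, phase, offset):
    return amp * np.sin(2 * np.pi * freq * t + phase) + offset

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 10, 300))                   # observation times, days
flux = sinusoid(t, 1.37, 0.02, 0.4, 1.0) + rng.normal(0, 0.003, t.size)

popt, pcov = curve_fit(sinusoid, t, flux, p0=[1.36, 0.015, 0.0, 1.0])
print("frequency = %.4f cycles/day" % popt[0])
```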

  4. GenoMass software: a tool based on electrospray ionization tandem mass spectrometry for characterization and sequencing of oligonucleotide adducts

    PubMed Central

    Sharma, Vaneet K; Glick, James; Liao, Qing; Shen, Chang; Vouros, Paul

    2012-01-01

    The analysis of DNA adducts is important for understanding DNA damage, and in the last few years mass spectrometry (MS) has emerged as the most comprehensive and versatile tool for routine characterization of modified oligonucleotides. Structural analysis of modified oligonucleotides by mass spectrometry, although routine, produces a large amount of data, and locating the exact position of the adduct by computational spectral interpretation remains a significant bottleneck. In this report, we present an additional feature of the in-house developed GenoMass software, which determines the exact location of an adduct in modified oligonucleotides by connecting tandem mass spectrometry (MS/MS) to a combinatorial isomer library generated in silico for nucleic acids. The performance of this MS/MS approach using GenoMass software was evaluated by MS/MS data interpretation for an unadducted 17-mer (5′OH-CCT ACC CCT TCC TTG TA-3′OH) oligonucleotide and its corresponding N-acetylaminofluorene (AAF) adduct. Further computational screening of this AAF-adducted 17-mer oligonucleotide from a complex oligonucleotide mixture was performed using GenoMass. Finally, GenoMass was also used to identify the positional isomers of the AAF-adducted 15-mer oligonucleotide (5′OH-ATGAACCGGAGGCCC-3′OH). GenoMass is simple, fast data-interpretation software that uses an in silico constructed library to relate the MS/MS sequencing approach to the exact location of an adduct on an oligonucleotide. PMID:22689626
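
    The combinatorial library idea reduces, at its simplest, to enumerating candidate adduct positions. Below is an illustrative sketch, not the GenoMass implementation, that lists the positional isomers of a single adduct on the 15-mer above; it assumes the adduct attaches to guanine (as AAF predominantly does) and leaves fragment-mass calculation to the real tool.

```python
# Sketch of the in silico combinatorial step: enumerate every position at
# which a single adduct could sit on an oligonucleotide. A real tool would
# also compute fragment-ion masses for each positional isomer.
def positional_isomers(sequence: str, target_base: str = "G") -> list[str]:
    isomers = []
    for i, base in enumerate(sequence):
        if base == target_base:
            # Mark the adducted base with [X*] notation (illustrative).
            isomers.append(sequence[:i] + "[" + base + "*]" + sequence[i + 1:])
    return isomers

for isomer in positional_isomers("ATGAACCGGAGGCCC"):
    print(isomer)
```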

  5. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple: a normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
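
    The S/N figure of merit itself is simple to state. The sketch below assumes one common convention, peak defect-signal amplitude over the RMS of gated grain noise; the waveforms are synthetic stand-ins for the Fortran engine's model outputs.

```python
# Minimal sketch of an S/N figure of merit: peak defect echo over RMS grain
# noise. Waveforms below are synthetic, not model outputs.
import numpy as np

def signal_to_noise(defect_signal: np.ndarray, grain_noise: np.ndarray) -> float:
    return np.max(np.abs(defect_signal)) / np.sqrt(np.mean(grain_noise ** 2))

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 0.05, 2048)                   # gated grain noise
echo = 0.4 * np.exp(-np.linspace(-3, 3, 2048) ** 2)   # defect echo envelope
print("S/N = %.1f" % signal_to_noise(echo, noise))
```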

  6. Application of a Software tool for Evaluating Human Factors in Accident Sequences

    SciTech Connect

    Queral, Cesar; Exposito, Antonio; Gonzalez, Isaac

    2006-07-01

    Probabilistic Safety Assessment (PSA) includes operator actions as elements of the set of protection features considered during accident sequences. Nevertheless, their impact throughout a sequence is not analyzed in a dynamic way, so it is desirable to study their importance in the dynamics of the sequences in more detail, allowing sensitivity studies with respect to human reliability and response times. For this reason, the CSN is involved in several activities oriented toward developing a new safety analysis methodology, the Integrated Safety Assessment (ISA), which must be able to incorporate operator actions in conventional thermo-hydraulic (TH) simulations. One of them is the collaboration project between CSN, HRP and the DSE-UPM that started in 2003. In the framework of this project, a software tool has been developed to incorporate operator actions in TH simulations. As part of the ISA, this tool permits quantifying human error probabilities (HEPs) and evaluating their impact on the final state of the plant. Independently, it can be used to evaluate the impact of operators' execution of procedures and guidelines on the final state of the plant, and to evaluate the allowable response times for manual operator actions. The results obtained in the first pilot case are included in this paper. (authors)

  7. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    NASA Technical Reports Server (NTRS)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues; however, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  8. PentaPlot: A software tool for the illustration of genome mosaicism

    PubMed Central

    Hamel, Lutz; Zhaxybayeva, Olga; Gogarten, J Peter

    2005-01-01

    Background Dekapentagonal maps depict the phylogenetic relationships of five genomes in a visually appealing diagram and can be viewed as an alternative to a single evolutionary consensus tree. In particular, the generated maps focus attention on those gene families that significantly deviate from the consensus or plurality phylogeny. PentaPlot is a software tool that computes such dekapentagonal maps given an appropriate probability support matrix. Results The visualization with dekapentagonal maps critically depends on the optimal layout of unrooted tree topologies representing different evolutionary relationships among five organisms along the vertices of the dekapentagon. This is a difficult optimization problem given the large number of possible layouts. At its core our tool utilizes a genetic algorithm with demes and a local search strategy to search for the optimal layout. The hybrid genetic algorithm performs satisfactorily even in those cases where the chosen genomes are so divergent that little phylogenetic information has survived in the individual gene families. Conclusion PentaPlot is being made publicly available as an open source project at . PMID:15938752

  9. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these model components.

  10. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers face a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation.
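
    The normalization and ratio-deduction steps can be sketched generically. The example below applies median normalization to a peptides-by-channels matrix of reporter-ion intensities and forms per-peptide ratios against a reference channel; it illustrates common practice, not MilQuant's actual algorithm, and all values are invented.

```python
# Generic sketch of isobaric quantitation post-processing: median-normalize
# reporter-ion channels, then deduce ratios against a reference channel.
import numpy as np

def normalize_and_ratio(intensities: np.ndarray, reference_channel: int = 0):
    """intensities: peptides x channels matrix of reporter-ion intensities."""
    # Equalize channel loading by dividing out each channel's median.
    norm = intensities / np.median(intensities, axis=0)
    # Per-peptide abundance ratios relative to the reference channel.
    return norm / norm[:, [reference_channel]]

data = np.array([[1200.0, 2300.0, 1100.0],
                 [ 800.0, 1700.0,  900.0]])
print(normalize_and_ratio(data))
```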

  11. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    PubMed

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (APIs) enable integration into automated, customized DNA design processes. The results presented herein highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
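
    Two commonly cited synthesis-success determinants are overall GC content and long homopolymer runs. The sketch below screens a sequence for both; the thresholds are hypothetical and do not reproduce BOOST's vendor-specific rule set.

```python
# Illustrative screen for two well-known synthesis constraints: global GC
# content and long homopolymer runs. Thresholds are hypothetical.
import re

def synthesis_violations(seq: str, gc_range=(0.25, 0.65), max_homopolymer=8):
    seq = seq.upper()
    issues = []
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    if not gc_range[0] <= gc <= gc_range[1]:
        issues.append(f"GC content {gc:.2f} outside {gc_range}")
    run = re.search(r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((max_homopolymer,) * 4), seq)
    if run:
        issues.append(f"homopolymer run '{run.group(0)[:12]}' at position {run.start()}")
    return issues

print(synthesis_violations("ATGC" * 20 + "A" * 12))   # flags the poly-A run
```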

  12. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

    A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response, PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real-time and display the time

  13. Should We Have Blind Faith in Bioinformatics Software? Illustrations from the SNAP Web-Based Tool

    PubMed Central

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N.; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology, but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina, using the chromosome and chromosomal positions of SNPs from the NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing from the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (a false-negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). Cohen's kappa coefficient, which measures the agreement between the two methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation. PMID:25742008

  14. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were found to agree well with extensive experimental investigations under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
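
    For reference, the two agreement statistics quoted above can be computed as follows. This is a hedged sketch assuming the standard definitions of mean relative error and Willmott's index of agreement; the observed/predicted values are invented.

```python
# Sketch of the two fit statistics quoted in the record: mean relative error
# and Willmott's index of agreement d = 1 - sum((P-O)^2) / sum((|P-Om|+|O-Om|)^2).
import numpy as np

def relative_error(observed, predicted):
    o, p = np.asarray(observed), np.asarray(predicted)
    return np.mean(np.abs(p - o) / np.abs(o))

def willmott_d(observed, predicted):
    o, p = np.asarray(observed), np.asarray(predicted)
    om = o.mean()
    return 1.0 - np.sum((p - o) ** 2) / np.sum((np.abs(p - om) + np.abs(o - om)) ** 2)

obs = [10.2, 11.8, 13.1, 14.7]
pred = [10.0, 12.0, 13.4, 14.5]
print("RE = %.3f, d = %.3f" % (relative_error(obs, pred), willmott_d(obs, pred)))
```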

  15. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    PubMed

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed.

  16. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  17. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  18. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    NASA Astrophysics Data System (ADS)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine in Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims to educate the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  19. Prognostic 2.0: software tool for heart rate variability analysis and QT interval dispersion

    NASA Astrophysics Data System (ADS)

    Mendoza, Alfonso; Rueda, Oscar L.; Bautista, Lola X.; Martinez, Víctor E.; Lopez, Eddie R.; Gomez, Mario F.; Alvarez, Alexander

    2007-09-01

    Cardiovascular diseases, in particular Acute Myocardial Infarction (AMI), are the first cause of death in industrialized countries. Measurements of indicators of the behavior of the autonomic nervous system, such as Heart Rate Variability (HRV) and QT Interval Dispersion (QTD), in the acute phase of AMI (the first 48 hours after the event) give a good estimation of the subsequent cardiac events that a person who has suffered an AMI could present. This paper describes the implementation of the second version of Prognostic-AMI, a software tool that automates the calculation of such indicators. It uses the Discrete Wavelet Transform (DWT) to de-noise the signals and to detect the QRS complex and the T-wave from a conventional 12-lead electrocardiogram. Indicators are measured in both the time and frequency domains. A pilot trial performed on a sample population of 76 patients shows that people who had cardiac complications in the acute phase of AMI have low values of the HRV and QTD indicators.
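
    Two of the standard time-domain HRV indicators are easy to state once RR intervals are available. The sketch below computes SDNN and RMSSD from a short, invented RR series; the upstream QRS detection (done with the discrete wavelet transform in Prognostic-AMI) is assumed to have already produced the intervals.

```python
# Hedged sketch of two standard time-domain HRV indicators computed from
# RR intervals in milliseconds: SDNN (overall variability) and RMSSD
# (beat-to-beat variability). RR values below are invented.
import numpy as np

def sdnn(rr_ms: np.ndarray) -> float:
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms: np.ndarray) -> float:
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rr = np.array([812, 790, 805, 798, 820, 815, 801], dtype=float)
print("SDNN = %.1f ms, RMSSD = %.1f ms" % (sdnn(rr), rmssd(rr)))
```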

  20. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    PubMed

    DeJesus, Michael A; Ambadipudi, Chaitra; Baker, Richard; Sassetti, Christopher; Ioerger, Thomas R

    2015-10-01

    TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have been previously implicated for growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.

  1. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
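
    The ray-tracing principle, treating each object point as a source of spherical waves, can be demonstrated in a few lines. The sketch below is a toy example, not the authors' code: it interferes two point sources with an off-axis plane reference wave and binarizes the fringe pattern for a printer; the wavelength, geometry and threshold are all illustrative.

```python
# Toy binary-CGH sketch: superpose spherical waves from object points with an
# off-axis plane reference wave, then threshold the intensity to a binary
# pattern suitable for a high-resolution printer. All values illustrative.
import numpy as np

wavelength = 633e-9                   # HeNe wavelength, metres
k = 2 * np.pi / wavelength
x = np.linspace(-2e-3, 2e-3, 1024)    # hologram plane coordinates, metres
X, Y = np.meshgrid(x, x)

points = [(0.0, 0.0, 0.10), (0.5e-3, 0.2e-3, 0.12)]   # object points (x, y, z)
field = np.zeros_like(X, dtype=complex)
for px, py, pz in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += np.exp(1j * k * r) / r   # spherical wave from each object point

reference = np.exp(1j * k * X * np.sin(np.deg2rad(1.0)))  # off-axis plane wave
intensity = np.abs(field + reference) ** 2
binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)
print(binary_cgh.shape, binary_cgh.mean())   # fringe pattern ready to print
```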

  2. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
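
    The decision rule just described can be emulated with a Monte Carlo envelope of Gaussian order statistics. The following hedged sketch uses a pointwise envelope, which is simpler (and slightly less strict) than GTest's calibrated acceptance region, and is written in Python rather than the original VBA.

```python
# Hedged sketch of a Q-Q acceptance-region test: simulate Gaussian order
# statistics, form a pointwise envelope, and accept normality when the
# standardized empirical quantiles stay inside it.
import numpy as np

def qq_acceptance_test(sample, alpha=0.05, n_sim=2000, seed=1):
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    n = sample.size
    z = (np.sort(sample) - sample.mean()) / sample.std(ddof=1)
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    lo = np.quantile(sims, alpha / 2, axis=0)
    hi = np.quantile(sims, 1 - alpha / 2, axis=0)
    return bool(np.all((z >= lo) & (z <= hi)))   # True: fits at level alpha

print(qq_acceptance_test(np.random.default_rng(0).normal(5, 2, 100)))
```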

  3. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  4. Productivity, part 2: cloud storage, remote meeting tools, screencasting, speech recognition software, password managers, and online data backup.

    PubMed

    Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet

    2014-06-01

    It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data.

  5. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    ERIC Educational Resources Information Center

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  6. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…
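
    Though the record above is truncated, the LSA approach it names is standard and easy to sketch: project TF-IDF vectors into a latent semantic space and flag essay pairs with high cosine similarity. The corpus, component count and threshold below are toy values, not the study's settings.

```python
# Sketch of an LSA similarity screen: TF-IDF vectors, truncated SVD to a
# latent space, cosine similarity between essay pairs. Toy corpus and
# threshold; the diagonal (self-similarity) is trivially flagged.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

essays = [
    "Operant conditioning shapes behaviour through reinforcement.",
    "Behaviour is shaped through reinforcement in operant conditioning.",
    "Attachment theory concerns early bonds between infant and caregiver.",
]
tfidf = TfidfVectorizer(stop_words="english").fit_transform(essays)
latent = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
sims = cosine_similarity(latent)
print((sims > 0.95).astype(int))   # flag suspiciously similar essay pairs
```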

  7. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    EPA Science Inventory

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be effectively and efficiently used to help environmentally conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  8. Students' Learning Experiences When Using a Dynamic Geometry Software Tool in a Geometry Lesson at Secondary School in Ethiopia

    ERIC Educational Resources Information Center

    Denbel, Dejene Girma

    2015-01-01

    Students' learning experiences were investigated in geometry lessons using a Dynamic Geometry Software (DGS) tool with 25 Ethiopian secondary school students. The research data were drawn from the worksheets used, classroom observations, results of pre- and post-tests, a questionnaire, and interview responses. I used GeoGebra as a DGS…

  9. A software tool for stitching two PET/CT body segments into a single whole-body image set.

    PubMed

    Chang, Tingting; Chang, Guoping; Clark, John W; Rohren, Eric M; Mawlawi, Osama R

    2012-05-10

    A whole-body PET/CT scan extending from the vertex of the head to the toes of the patient is not feasible on a number of commercially available PET/CT scanners due to a limitation in the extent of bed travel on these systems. In such cases, the PET scan has to be divided into two parts: one covering the upper body segment and the other the lower body segment. The aim of this paper is to describe and evaluate, using phantom and patient studies, a software tool that was developed to stitch two body segments and output a single whole-body image set, thereby facilitating the interpretation of whole-body PET scans. A mathematical model was first developed to stitch images from two body segments using three landmarks. The model calculates the relative positions of the landmarks on the two segments and then generates a rigid transformation that aligns these landmarks on the two segments. A software tool was written to implement this model while correcting for radioactive decay between the two body segments, and to output a single DICOM whole-body image set with all the necessary tags. One phantom study and six patient studies were conducted to evaluate the performance of the software. In these studies, six radio-opaque markers (BBs) were used as landmarks (three on each leg). All studies were acquired in two body segments with BBs placed in the overlap region of the two segments. The PET/CT images of each segment were then stitched using the software tool to create a single DICOM whole-body PET/CT image. Evaluation of the stitching tool was based on visual inspection, consistency of radiotracer uptake in the two segments, and the ability to display the resultant DICOM image set on two independent workstations. The software tool successfully stitched the two segments of the phantom image, and generated a single whole-body DICOM PET/CT image set that had the correct alignment and activity concentration throughout the image. The stitched images were viewed by two independent
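
    The landmark-alignment step admits a compact closed-form solution. Below is a hedged sketch, not the authors' implementation, using the SVD-based Kabsch procedure to recover the rigid transform from three matched BB positions; the coordinates are invented and decay correction is omitted.

```python
# Hedged sketch of the stitching model: recover the rigid transform that maps
# the landmark (BB) positions on one body segment onto the other using the
# SVD-based Kabsch procedure, so that dst ~ src @ R.T + t.
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t aligning src onto dst."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dc - R @ sc

upper = np.array([[10.0, 5.0, 100.0], [12.0, 5.5, 101.0], [11.0, 7.0, 99.0]])
lower = upper + np.array([0.4, -0.2, 250.0])      # same BBs seen in segment 2
R, t = rigid_transform(lower, upper)
print(np.allclose(lower @ R.T + t, upper, atol=1e-6))   # True: segments align
```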

  10. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) intended to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening for the return echoes that occur at dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing-point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA data using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web

  11. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    NASA Astrophysics Data System (ADS)

    Mayorga, E.

    2013-12-01

    Practical, problem-oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower-level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and with well-characterized mechanistic processes, have often prompted the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (RiverTools, TauDEM, etc.) and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such

  12. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    PubMed Central

    Schwarz, Roland; Musch, Patrick; von Kamp, Axel; Engels, Bernd; Schirmer, Heiner; Schuster, Stefan; Dandekar, Thomas

    2005-01-01

    Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL as a reliable high-performance implementation of the algorithm has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and the most central, well connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download at . Conclusion A graphical toolbox and an editor for METATOOL as well as a series of additional routines for metabolic network analyses constitute a new user-friendly software for such efforts. PMID:15929789
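
    Underlying flux-mode analysis is the steady-state condition S·v = 0 on the stoichiometric matrix S; elementary modes are the support-minimal, non-decomposable flux vectors satisfying it. The sketch below only computes a null-space basis for a toy three-reaction chain, which for this network coincides with its single elementary mode; it is not METATOOL's algorithm.

```python
# Minimal sketch of the steady-state condition behind flux-mode analysis:
# admissible flux distributions v satisfy S @ v = 0, i.e. they lie in the
# null space of the stoichiometric matrix S. Toy three-reaction network.
import numpy as np
from scipy.linalg import null_space

# Rows: metabolites A, B; columns: reactions ->A, A->B, B->
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
basis = null_space(S)
print(basis / basis[0])   # normalized: all three reactions carry equal flux
```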

  13. Technical Note: Approaches and software tools to investigate the impact of ocean acidification

    NASA Astrophysics Data System (ADS)

    Gattuso, J.-P.; Lavigne, H.

    2009-10-01

    Although future changes in seawater carbonate chemistry are well constrained, their impact on marine organisms and ecosystems remains poorly known. The biological response to ocean acidification is a recent field of research, as the first purposeful experiments were only carried out in the late 1990s. The potentially dire consequences of ocean acidification attract scientists and students with limited knowledge of the carbonate chemistry and its experimental manipulation. Hence, some guidelines on carbonate chemistry manipulations may be helpful for the growing ocean acidification community to maintain comparability. Perturbation experiments are one of the key approaches used to investigate the biological response to elevated pCO2. They are based on measurements of physiological or metabolic processes in organisms and communities exposed to seawater with normal or altered carbonate chemistry. Seawater chemistry can be manipulated in different ways depending on the facilities available and on the question being addressed. The goal of this paper is (1) to examine the benefits and drawbacks of various manipulation techniques and (2) to describe a new version of the R software package seacarb, which includes new functions aimed at assisting the design of ocean acidification perturbation experiments. Three approaches closely mimic the ongoing and future changes in seawater carbonate chemistry: gas bubbling, addition of high-CO2 seawater, and combined additions of acid and bicarbonate and/or carbonate.

  14. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    SciTech Connect

    Sterling, T.; Messina, P.; Chen, M.

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  15. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    PubMed

    Lewis, Sheri L; Feighner, Brian H; Loschen, Wayne A; Wojcik, Richard A; Skora, Joseph F; Coberly, Jacqueline S; Blazes, David L

    2011-05-10

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  16. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (currently > 6000 active users, from an estimated community of comparable size) has embedded the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution we tested mechanisms based on the iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  17. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run to use the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included, providing information that augments NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  18. Integrated Design and Analysis Tools for Software-Based Control Systems

    DTIC Science & Technology

    2005-07-01

    ...a new project, FRESCO, on faithfully implementing hybrid models in real-time software. Beyond HyTech: Hybrid Systems Analysis Using Interval Numerical... terminating when no new states are encountered. This enables model checking of reachability properties. FRESCO: Formal Real-time Software Components. We... software concepts on autonomous model helicopters. Fresco: Ben Horowitz and Christoph Meyer have finished a draft implementation of a skeleton of...

  19. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming

    PubMed Central

    Rosenberg, Michael; Lay, Brendan S.; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play for 43 children (mean = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437

  20. A flexible, interactive software tool for fitting the parameters of neuronal models

    PubMed Central

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I.; Freund, Tamás F.; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool. PMID

  1. A flexible, interactive software tool for fitting the parameters of neuronal models.

    PubMed

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
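
    The optimization task Optimizer automates can be pictured with a minimal curve-fitting sketch: choose a cost function comparing model output with a target trace, then minimize it over the parameters. Here scipy.optimize stands in for Optimizer's configurable algorithms, and the single-exponential membrane model and all values are invented for illustration:

      import numpy as np
      from scipy.optimize import minimize

      # Toy "experimental" trace: an exponential voltage decay with noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 100.0, 200)                      # ms
      v_target = -70.0 + 10.0 * np.exp(-t / 20.0)           # "true" membrane trace
      v_noisy = v_target + rng.normal(0.0, 0.3, t.size)

      def model(params):
          v_rest, amp, tau = params
          return v_rest + amp * np.exp(-t / tau)

      def cost(params):
          # Mean squared error, one of many possible cost functions.
          return np.mean((model(params) - v_noisy) ** 2)

      fit = minimize(cost, x0=[-65.0, 5.0, 10.0], method="Nelder-Mead")
      print("fitted (v_rest, amp, tau):", fit.x)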

  2. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    PubMed

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play for 43 children (mean = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  3. Biomedical Mutation Analysis (BMA): A software tool for analyzing mutations associated with antiviral resistance

    PubMed Central

    Salvatierra, Karina; Florez, Hector

    2016-01-01

    Introduction: Hepatitis C virus (HCV) is considered a major public health problem, with 200 million people infected worldwide. The treatment of chronic HCV infection with pegylated interferon alpha plus ribavirin is non-specific; consequently, the treatment is effective in only 50% of infected patients. This has prompted the development of direct-acting antivirals (DAA) that target virus proteins. These DAA have demonstrated a potent effect in vitro and in vivo; however, virus mutations associated with the development of resistance have been described. Objective: To design and develop an online information system for detecting mutations in amino acids known to be implicated in resistance to DAA. Materials and methods: We have used computer applications, technological tools, standard languages, infrastructure systems and algorithms to analyze positions associated with resistance to DAA for the NS3, NS5A, and NS5B genes of HCV. Results: We have designed and developed an online information system named Biomedical Mutation Analysis (BMA), which allows users to calculate changes in nucleotide and amino acid sequences for each selected sequence from conventional Sanger and cloning sequencing using a graphical interface. Conclusion: BMA quickly, easily and effectively analyzes mutations, including complete documentation and examples. Furthermore, the development of different visualization techniques allows proper interpretation and understanding of the results. The data obtained using BMA will be useful for the assessment and surveillance of HCV resistance to new antivirals, and for designing treatment regimens by selecting those DAA to which the virus is not resistant, avoiding unnecessary treatment failures. The software is available at: http://bma.itiud.org. PMID:27547378
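
    The core comparison BMA performs, checking translated codons at resistance-associated positions against a reference, can be sketched in a few lines of Python. The codon-table subset, positions, wild-type residues, and sequences below are invented for illustration and are not BMA's curated data:

      # Minimal subset of the standard genetic code, enough for the demo below.
      CODON_TABLE = {"ATG": "M", "GTC": "V", "GCC": "A", "CAA": "Q", "CGT": "R"}

      # Protein position -> expected wild-type residue (invented placeholders).
      RESISTANCE_POSITIONS = {2: "V", 4: "R"}

      def check_resistance(nt_seq, positions):
          """Return (position, wild_type, observed) for each mismatch."""
          findings = []
          for pos, wild_type in positions.items():
              codon = nt_seq[3 * (pos - 1): 3 * pos]
              aa = CODON_TABLE.get(codon, "?")
              if aa != wild_type:
                  findings.append((pos, wild_type, aa))
          return findings

      print(check_resistance("ATGGTCCAACGT", RESISTANCE_POSITIONS))  # [] (wild type)
      print(check_resistance("ATGGCCCAACGT", RESISTANCE_POSITIONS))  # [(2, 'V', 'A')]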

  4. Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin

    2011-01-01

    The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.

  5. GEnomes Management Application (GEM.app): A new software tool for large-scale collaborative genome analysis

    PubMed Central

    Gonzalez, Michael A.; Acosta Lebrigio, Rafael F.; Van Booven, Derek; Ulloa, Rick H.; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schule, Rebecca; Zuchner, Stephan

    2015-01-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ~1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for non-bioinformaticians to make NGS data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 seconds across ~1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. PMID:23463597
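
    One of the filters described above, evaluation of segregation in families, reduces to a simple genotype predicate per variant. A toy sketch follows; the variant rows, sample names, and genotype encoding (0 = ref/ref, 1 = het, 2 = alt/alt) are invented and much simpler than GEM.app's nested filtering:

      variants = [
          {"gene": "GENE_A", "pos": 101,
           "genotypes": {"affected1": 1, "affected2": 1, "unaffected1": 0}},
          {"gene": "GENE_B", "pos": 202,
           "genotypes": {"affected1": 1, "affected2": 0, "unaffected1": 0}},
          {"gene": "GENE_C", "pos": 303,
           "genotypes": {"affected1": 1, "affected2": 1, "unaffected1": 1}},
      ]
      affected = ["affected1", "affected2"]
      unaffected = ["unaffected1"]

      def segregates(variant):
          # Dominant-model filter: carried by all affected, absent in unaffected.
          g = variant["genotypes"]
          return all(g[s] > 0 for s in affected) and all(g[s] == 0 for s in unaffected)

      print([v["gene"] for v in variants if segregates(v)])  # ['GENE_A']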

  6. Investigation of the environmental impacts of municipal wastewater treatment plants through a Life Cycle Assessment software tool.

    PubMed

    De Feo, G; Ferrara, C

    2016-10-11

    This paper investigates the total and per capita environmental impacts of municipal wastewater treatment as a function of the population equivalent (PE), with a Life Cycle Assessment (LCA) approach using the processes of the Ecoinvent 2.2 database available in the software tool SimaPro v.7.3. Besides the wastewater treatment plant (WWTP), the study also considers the sewerage system. The obtained results confirm that there is a 'scale factor' for wastewater collection and treatment even in environmental terms, in addition to the well-known scale factor in terms of management costs. Thus, the larger the treatment plant, the lower the per capita environmental impacts. However, the Ecoinvent 2.2 database does not contain information about treatment systems with a capacity lower than 30 PE. Nevertheless, worldwide there are many sparsely populated areas where it is not practical to build a single centralized WWTP. Therefore, it would be very important to conduct an LCA study comparing alternative on-site small-scale systems with treatment capacities of a few PE.

  7. Detecting variants with Metabolic Design, a new software tool to design probes for explorative functional DNA microarray development

    PubMed Central

    2010-01-01

    Background Microorganisms display vast diversity, and each one has its own set of genes, cell components and metabolic reactions. To assess their huge unexploited metabolic potential in different ecosystems, we need high throughput tools, such as functional microarrays, that allow the simultaneous analysis of thousands of genes. However, most classical functional microarrays use specific probes that monitor only known sequences, and so fail to cover the full microbial gene diversity present in complex environments. We have thus developed an algorithm, implemented in the user-friendly program Metabolic Design, to design efficient explorative probes. Results First we have validated our approach by studying eight enzymes involved in the degradation of polycyclic aromatic hydrocarbons from the model strain Sphingomonas paucimobilis sp. EPA505 using a designed microarray of 8,048 probes. As expected, microarray assays identified the targeted set of genes induced during biodegradation kinetics experiments with various pollutants. We have then confirmed the identity of these new genes by sequencing, and corroborated the quantitative discrimination of our microarray by quantitative real-time PCR. Finally, we have assessed metabolic capacities of microbial communities in soil contaminated with aromatic hydrocarbons. Results show that our probe design (sensitivity and explorative quality) can be used to study a complex environment efficiently. Conclusions We successfully use our microarray to detect gene expression encoding enzymes involved in polycyclic aromatic hydrocarbon degradation for the model strain. In addition, DNA microarray experiments performed on soil polluted by organic pollutants without prior sequence assumptions demonstrate high specificity and sensitivity for gene detection. Metabolic Design is thus a powerful, efficient tool that can be used to design explorative probes and monitor metabolic pathways in complex environments, and it may also be used to
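
    One building block of explorative probe design is handling degeneracy: a probe written with IUPAC ambiguity codes stands for the whole set of concrete sequences it can match. A short sketch expanding such a probe follows; the probe itself is invented, and Metabolic Design additionally scores probes against sequence databases for sensitivity and specificity:

      from itertools import product

      # Standard IUPAC nucleotide ambiguity codes.
      IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
               "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
               "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

      def expand(probe):
          """All concrete sequences covered by a degenerate probe."""
          return ["".join(bases) for bases in product(*(IUPAC[b] for b in probe))]

      print(expand("ATGRYN"))  # 1*1*1*2*2*4 = 16 concrete 6-mers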

  8. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    NASA Technical Reports Server (NTRS)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and involve trial-and-error procedures that may include activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) as a design tool, a model is the only way to determine the effects of design choices without building a prototype, which, in the context of rocket test stands, is impracticable; 2) as a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.

  9. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development process can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see whether any metrics can be implemented in their software assurance life cycle process.
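
    As a concrete example of the kind of metric discussed here, defect density normalizes defect counts by code size so that modules of different sizes can be compared. A tiny sketch with invented module data (not taken from the projects surveyed above):

      # Module name -> (reported defects, lines of code); illustrative only.
      modules = {"parser": (12, 4800), "scheduler": (3, 2100), "ui": (20, 15500)}

      def defect_density(defects, loc):
          return 1000.0 * defects / loc   # defects per thousand lines (KLOC)

      for name, (defects, loc) in modules.items():
          print(f"{name}: {defect_density(defects, loc):.2f} defects/KLOC")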

  10. Software tools for developing parallel applications. Part 2: Interactive control and performance tuning

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    This paper continues the discussion of parallel tool support with an overview of the current state of tools for runtime control and performance tuning. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  11. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation. We call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions, and to solve the FE equations. This tool uses the PAK solver, open source software implemented in the SIFEM FP7 project, for calculation of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  12. LDLR Database (second edition): new additions to the database and the software, and results of the first molecular analysis.

    PubMed Central

    Varret, M; Rabés, J P; Thiart, R; Kotze, M J; Baron, H; Cenarro, A; Descamps, O; Ebhardt, M; Hondelijn, J C; Kostner, G M; Miyake, Y; Pocovi, M; Schmidt, H; Schuster, H; Stuhrmann, M; Yamamura, T; Junien, C; Béroud, C; Boileau, C

    1998-01-01

    Mutations in the LDL receptor gene (LDLR) cause familial hypercholesterolemia (FH), a common autosomal dominant disorder. The LDLR database is a computerized tool that has been developed to provide the means to analyse the numerous mutations that have been identified in the LDLR gene. The second version of the LDLR database contains 140 new entries, and the software has been modified to accommodate four new routines. The analysis of the updated data (350 mutations) gives the following information: (i) 63% of the mutations are missense, and only 20% occur in CpG dinucleotides; (ii) although the mutations are widely distributed throughout the gene, there is an excess of mutations in exons 4 and 9, and a deficit in exons 13 and 15; (iii) the analysis of the distribution of mutations located within the ligand-binding domain shows that 74% of the mutations in this domain affect a conserved amino acid, and that they are mostly confined to the C-terminal region of the repeats. Conversely, the same analysis in the EGF-like domain shows that 64% of the mutations in this domain affect a non-conserved amino acid, and that they are mostly confined to the N-terminal half of the repeats. The database is now accessible on the World Wide Web at http://www.umd.necker.fr PMID:9399845

  13. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  14. WSXM: a software for scanning probe microscopy and a tool for nanotechnology.

    PubMed

    Horcas, I; Fernández, R; Gómez-Rodríguez, J M; Colchero, J; Gómez-Herrero, J; Baro, A M

    2007-01-01

    In this work we briefly describe the most relevant features of WSXM, freeware scanning probe microscopy software based on MS-Windows. The article is structured in three sections: the introduction is a perspective on the importance of software in scanning probe microscopy. The second section is devoted to describing the general structure of the application; here the capabilities of WSXM to read third-party files are stressed. Finally, a detailed discussion of some relevant procedures of the software is carried out.

  15. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC-88) on the use of the geostationary orbit (GEO) and the planning of space services utilizing it. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  16. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, as well as an evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, computation of eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http
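
    Two of the signal-processing operations listed above, IIR band-pass filtering and power-spectral-density estimation, can be reproduced with standard tools. The sketch below uses Python's scipy.signal rather than STK itself, and the sampling rate, filter band, and synthetic trace are arbitrary choices:

      import numpy as np
      from scipy import signal

      fs = 100.0                                   # samples per second
      t = np.arange(0, 60, 1 / fs)
      # Synthetic "seismic" trace: a 1.5 Hz tone buried in white noise.
      trace = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.randn(t.size)

      # 4th-order Butterworth band-pass (an IIR filter), applied zero-phase.
      sos = signal.butter(4, [0.5, 5.0], btype="bandpass", fs=fs, output="sos")
      filtered = signal.sosfiltfilt(sos, trace)

      # Welch estimate of the power spectral density.
      freqs, psd = signal.welch(filtered, fs=fs, nperseg=1024)
      print("peak PSD at %.2f Hz" % freqs[np.argmax(psd)])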

  17. Software Tools for Lifetime Assessment of Thermal Barrier Coatings Part I — Thermal Ageing Failure and Thermal Fatigue Failure

    NASA Astrophysics Data System (ADS)

    Renusch, Daniel; Rudolphi, Mario; Schütze, Michael

    Thermal barrier coatings (TBCs) increase the service lifetime of specific components in, for example, gas turbines or airplane engines, and allow higher operating temperatures to increase efficiency. Lifetime prediction models are therefore of both academic and applied interest, either to test new coatings or to determine operational conditions that can ensure a certain lifetime, for example 25,000 hr for gas turbines. Driven by these demands, the equations used in lifetime prediction have become more and more sophisticated and consequently are complicated to apply. A collection of software tools for lifetime assessment was therefore developed to provide an easy-to-use graphical user interface whilst incorporating the recent improvements in modeling equations. The Windows-based software is compatible with other Windows applications, such as PowerPoint, Excel, or Origin. Laboratory lifetime data from isothermal, thermal cyclic and/or burner rig testing can be loaded into the software for analysis, and the program provides confidence limits and an accuracy assessment of the analysis model. The main purpose of the software tool is to predict TBC spallation for a given bond coat temperature, temperature gradient across the coating, and thermal cycle frequency.

  18. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    PubMed

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users.
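
    The assignment step both packages implement rests on two empirical maps: a CT calibration from Hounsfield units to apparent density, then a density-to-modulus power law of the form E = a * rho^b. A minimal sketch follows; the calibration slope, intercept, and power-law coefficients are illustrative placeholders, not the values used by py_bonemat_abaqus or Bonemat:

      import numpy as np

      def hu_to_density(hu, slope=0.0007, intercept=0.1):
          # Linear CT calibration, g/cm^3 per HU (assumed values).
          return slope * hu + intercept

      def density_to_modulus(rho, a=6850.0, b=1.49):
          # Power law E = a * rho^b in MPa (assumed coefficients).
          return a * np.power(rho, b)

      hu_values = np.array([200.0, 800.0, 1400.0])   # sample voxel HUs
      E = density_to_modulus(hu_to_density(hu_values))
      print(np.round(E, 1))                          # element moduli in MPa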

  19. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    DTIC Science & Technology

    2011-05-10

    health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular... concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility... the scope of reportable conditions and are intended to help prevent and respond to global public health threats. SAGES, an electronic biosurveillance

  20. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    USGS Publications Warehouse

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
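
    For contrast with the regional method described above, the simplest widely used way to transfer daily streamflow to an ungauged site is the drainage-area-ratio method, which scales the gauged hydrograph by the ratio of drainage areas. A minimal sketch with invented values; this is a generic hydrology baseline, not the algorithm of the USGS tool:

      def area_ratio_streamflow(q_gauged, area_gauged_km2, area_ungauged_km2):
          # Scale each daily flow by the drainage-area ratio.
          scale = area_ungauged_km2 / area_gauged_km2
          return [q * scale for q in q_gauged]

      gauge_q = [12.0, 15.5, 9.8, 7.2]           # daily flows at the gauge (m^3/s)
      print(area_ratio_streamflow(gauge_q, 250.0, 180.0))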

  1. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    NASA Astrophysics Data System (ADS)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
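
    Because the DFT is a linear transform X = F x, a GUM-style propagation of an input covariance C_x gives C_X = F C_x F^H. GUM2DFT implements efficient closed formulas for this; the brute-force sketch below only illustrates the principle on a tiny example with an assumed i.i.d. input uncertainty:

      import numpy as np

      n = 8
      x = np.sin(2 * np.pi * np.arange(n) / n)     # time-domain signal
      u = 0.05                                     # standard uncertainty per sample
      C_x = (u ** 2) * np.eye(n)                   # i.i.d. input covariance

      F = np.fft.fft(np.eye(n), axis=0)            # explicit DFT matrix
      X = F @ x

      # Propagate separately to the real and imaginary parts of X.
      Fr, Fi = F.real, F.imag
      C_ReX = Fr @ C_x @ Fr.T                      # covariance of Re(X)
      C_ImX = Fi @ C_x @ Fi.T                      # covariance of Im(X)
      print(np.sqrt(np.diag(C_ReX)))
      print(np.sqrt(np.diag(C_ImX)))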

  2. Particle Loss Calculator - a new software tool for the assessment of the performance of aerosol inlet systems

    NASA Astrophysics Data System (ADS)

    von der Weiden, S.-L.; Drewnick, F.; Borrmann, S.

    2009-09-01

    Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements through a constant-diameter sampling probe. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
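
    One of the loss mechanisms such a calculator must treat is diffusional deposition in laminar tube flow, for which the Gormley-Kennedy correlation gives the penetration (surviving fraction) as a function of the dimensionless parameter xi = pi*D*L/Q. The sketch uses the coefficient values commonly quoted in aerosol textbooks, reproduced here from memory, so verify before quantitative use; it is not necessarily the PLC's parameterization:

      import numpy as np

      def gormley_kennedy(D, L, Q):
          """Penetration for laminar-flow diffusion losses.
          D: diffusion coefficient (m^2/s), L: tube length (m), Q: flow (m^3/s)."""
          xi = np.pi * D * L / Q
          if xi < 0.02:
              return 1.0 - 2.5638 * xi ** (2 / 3) + 1.2 * xi + 0.1767 * xi ** (4 / 3)
          return (0.8191 * np.exp(-3.657 * xi)
                  + 0.0975 * np.exp(-22.3 * xi)
                  + 0.0325 * np.exp(-57.0 * xi))

      # ~10 nm particle (D ~ 5.2e-8 m^2/s) through 2 m of tubing at 1 L/min
      print(gormley_kennedy(5.2e-8, 2.0, 1.0 / 60e3))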

  3. Particle Loss Calculator - a new software tool for the assessment of the performance of aerosol inlet systems

    NASA Astrophysics Data System (ADS)

    von der Weiden, S.-L.; Drewnick, F.; Borrmann, S.

    2009-04-01

    Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.

  4. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated change over between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  5. BiQ Analyzer HiMod: an interactive software tool for high-throughput locus-specific analysis of 5-methylcytosine and its oxidized derivatives.

    PubMed

    Becker, Daniel; Lutsik, Pavlo; Ebert, Peter; Bock, Christoph; Lengauer, Thomas; Walter, Jörn

    2014-07-01

    Recent data suggest important biological roles for oxidative modifications of methylated cytosines, specifically hydroxymethylation, formylation and carboxylation. Several assays are now available for profiling these DNA modifications genome-wide as well as in targeted, locus-specific settings. Here we present BiQ Analyzer HiMod, a user-friendly software tool for sequence alignment, quality control and initial analysis of locus-specific DNA modification data. The software supports four different assay types, and it leads the user from raw sequence reads to DNA modification statistics and publication-quality plots. BiQ Analyzer HiMod combines the well-established graphical user interface of its predecessor tool, BiQ Analyzer HT, with new and extended analysis modes. BiQ Analyzer HiMod also includes updates to the analysis workspace, an intuitive interface, a custom vector graphics engine and support for additional input and output data formats. The tool is freely available as a stand-alone installation package from http://biq-analyzer-himod.bioinf.mpi-inf.mpg.de/.

  6. Mössbauer spectroscopy: an excellent additional tool for the study of magnetic soils and sediments

    NASA Astrophysics Data System (ADS)

    Vandenberghe, R. E.; Hus, J. J.; de Grave, E.

    2009-04-01

    Since the discovery, half a century ago, of resonant gamma absorption, known as the Mössbauer effect, the derived spectroscopic method (MS) has proven to be a very suitable tool for the characterization of soil and rock minerals. From the conventional absorption spectra of iron-containing compounds, so-called hyperfine parameters are derived which are more or less characteristic of each kind of mineral. MS therefore has a certain analytical power for the characterization of iron-bearing minerals. This is especially true for magnetic minerals, for which the spectrum contains an additional hyperfine parameter. Moreover, MS also allows information to be retrieved about the magnetic structure and behavior. Because the relative areas of the spectra are to some extent proportional to the number of iron atoms in each environment, MS yields quantitative information not only about the various minerals present but also about the iron in the different crystallographic sites. The power of MS as an excellent additional tool for the study of magnetic soils and sediments was well demonstrated in the joint research with Jozef Hus (CPG-IRM, Dourbes). In our common work, the emphasis was mainly on the study of Chinese loess and soils. Using MS on magnetically separated samples, the various magnetic species in a loess and its associated soil were for the first time discerned in a direct way. Further, magnetically enriched samples of four different loess/paleosol couplets from a loess sequence in Huangling were systematically investigated by MS. From the qualitative and quantitative information obtained, the neoformation of magnetite/maghemite in the soils, responsible for the observed increase in remanence and susceptibility, could be demonstrated.

  7. Gammasphere software development. Progress report

    SciTech Connect

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  8. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis.

    PubMed

    Botton-Divet, Léo; Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses.

  9. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis

    PubMed Central

    Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages (‘Edgewarp’ and ‘Morpho’) for the same sliding task, and investigate potential differences in the results and biological interpretation. ‘Morpho’ is much faster than ‘Edgewarp,’ notably as a result of the greater computational power of the ‘Morpho’ software routines and the complexity of the ‘Edgewarp’ workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086
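
    The geometric morphometric workflow both packages support starts from superimposing landmark configurations. SciPy's ordinary Procrustes analysis can illustrate that first step on toy data; the sliding of semi-landmarks performed by 'Edgewarp' and 'Morpho' is an additional refinement not shown here:

      import numpy as np
      from scipy.spatial import procrustes

      # Two toy 2-D landmark configurations: shape_b is a scaled, translated
      # copy of shape_a, so after superimposition the disparity should be ~0.
      shape_a = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
      shape_b = shape_a * 1.8 + 0.3

      m1, m2, disparity = procrustes(shape_a, shape_b)
      print(f"disparity after superimposition: {disparity:.2e}")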

  10. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools as well. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through Object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  11. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    NASA Astrophysics Data System (ADS)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced; it also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can be easily customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
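
    The central operation of spatial allocation, accumulating source emissions into grid cells, reduces to a weighted 2-D histogram. A minimal NumPy sketch with invented point sources; the real system also allocates by proxies such as road networks and land use, not only point locations:

      import numpy as np

      # Invented point sources near Cyprus: longitudes, latitudes, NOx totals.
      x = np.array([33.02, 33.40, 33.41, 33.95])        # longitude (deg E)
      y = np.array([34.70, 35.10, 35.12, 35.30])        # latitude (deg N)
      nox = np.array([12.0, 3.5, 7.1, 0.9])             # t/yr per source

      # Sum emissions into an 8 x 6 grid over a fixed bounding box.
      grid, xedges, yedges = np.histogram2d(
          x, y, bins=[8, 6], range=[[32.5, 34.5], [34.5, 35.5]], weights=nox)
      print(grid.sum(), "t/yr allocated over an 8 x 6 grid")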

  12. The ERATO Systems Biology Workbench: enabling interaction and exchange between software tools for computational biology.

    PubMed

    Hucka, M; Finney, A; Sauro, H M; Bolouri, H; Doyle, J; Kitano, H

    2002-01-01

    Researchers in computational biology today make use of a large number of different software packages for modeling, analysis, and data manipulation and visualization. In this paper, we describe the ERATO Systems Biology Workbench (SBW), a software framework that allows these heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's data and algorithmic capabilities. Our goal is to create a simple, open-source software infrastructure which is effective, easy to implement and easy to understand. SBW uses a broker-based architecture and enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe the SBW architecture and the current set of modules, as well as alternative implementation technologies.
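
    The broker-based architecture described above can be illustrated with a toy, in-process sketch: modules register named services with a broker, and other modules invoke them through the broker rather than linking against each other. All names here are hypothetical; the real SBW mechanism is a network protocol with client-side libraries, which is not reproduced.

    ```python
    # Toy broker sketch (hypothetical names, not the actual SBW protocol):
    # modules expose named services; any module can call another's service
    # through the broker without direct coupling.

    class Broker:
        def __init__(self):
            self.services = {}

        def register(self, module, service, handler):
            self.services[(module, service)] = handler

        def call(self, module, service, *args):
            return self.services[(module, service)](*args)

    broker = Broker()
    # A "simulator" module exposes an algorithmic capability...
    broker.register("simulator", "steady_state",
                    lambda model: f"steady state of {model}")
    # ...and a "gui" module uses it without linking against the simulator.
    print(broker.call("simulator", "steady_state", "my_model.sbml"))
    ```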

  13. Development of a case tool to support decision based software development

    NASA Technical Reports Server (NTRS)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: demonstrated DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a preliminary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the re-engineering problem; and developed a preliminary chart of the problems involved in the re-engineering process.

  14. Validation of a Low-Thrust Mission Design Tool Using Operational Navigation Software

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Knittel, Jeremy M.; Williams, Ken; Stanbridge, Dale; Ellison, Donald H.

    2017-01-01

    Design of flight trajectories for missions employing solar electric propulsion requires a suitably high-fidelity design tool. In this work, the Evolutionary Mission Trajectory Generator (EMTG) is presented as a medium-high fidelity design tool that is suitable for mission proposals. EMTG is validated against the high-heritage deep-space navigation tool MIRAGE, demonstrating both the accuracy of EMTG's model and an operational mission design and navigation procedure using both tools. The validation is performed using a benchmark mission to the Jupiter Trojans.

  15. ARCHER, a New Monte Carlo Software Tool for Emerging Heterogeneous Computing Environments

    NASA Astrophysics Data System (ADS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2014-06-01

    The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.
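
    As a toy illustration of the Monte Carlo transport kernel that such testbeds parallelize (a sketch under strong simplifying assumptions, not ARCHER's physics), the following estimates particle transmission through a 1D slab with exponential free paths:

    ```python
    # Toy 1D Monte Carlo particle transport (illustrative only): exponential
    # free paths, fixed absorption probability per collision, and crude
    # "isotropic" scattering modeled as a random choice of flight direction.

    import math
    import random

    def transmitted_fraction(mu, thickness, absorb_prob, n=50_000):
        """Fraction of particles crossing a slab of given thickness (cm)
        with total attenuation coefficient mu (1/cm)."""
        transmitted = 0
        for _ in range(n):
            x, direction = 0.0, 1.0
            while True:
                x += direction * (-math.log(1.0 - random.random()) / mu)
                if x > thickness:
                    transmitted += 1
                    break
                if x < 0.0:
                    break                      # escaped backwards (reflected)
                if random.random() < absorb_prob:
                    break                      # absorbed at the collision site
                direction = random.choice((-1.0, 1.0))  # new flight direction
        return transmitted / n

    print(transmitted_fraction(mu=0.2, thickness=5.0, absorb_prob=0.5))
    ```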

  16. Software Tools for Software Maintenance

    DTIC Science & Technology

    1988-10-01

    [OCR fragment of a tool/platform matrix from the original report; recoverable entries include IBM mainframe (DOS, OS) maintenance tools such as Spoc, Quikjob, Cobol Debug, QUODS (Quick Online Debugging System), Superbug, Tracer, and Xdebug, covering Assembler, Cobol, and Fortran code.]

  17. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are used more and more in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of the digital teaching aids in use in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  18. VoIPNET: A Software Based Communications Tool for Low-Bandwidth Networks

    DTIC Science & Technology

    2007-06-01

    [OCR fragments of the report abstract; documentation-page boilerplate removed:] …deploy full duplex telephone communications services to bandwidth-deprived organizations via an existing wireless network infrastructure. … communications, it is half-duplex, cumbersome, unreliable, and subject to availability due to net traffic. Voice over IP may be the solution to deploy full…

  19. Software tools for data modelling and processing of human body temperature circadian dynamics.

    PubMed

    Petrova, Elena S; Afanasova, Anastasia I

    2015-01-01

    This paper presents a software development effort for simulating and processing thermometry data. The motivation for this research is the miniaturization of actuators attached to the human body, which allows frequent temperature measurements and improves medical diagnosis procedures related to circadian dynamics.

  20. The Use of Geogebra Software as a Calculus Teaching and Learning Tool

    ERIC Educational Resources Information Center

    Nobre, Cristiane Neri; Meireles, Magali Rezende Gouvêa; Vieira, Niltom, Jr.; de Resende, Mônica Neli; da Costa, Lucivânia Ester; da Rocha, Rejane Corrêa

    2016-01-01

    Information and Communication Technologies (ICT) in education provide a new learning environment in which students build their own knowledge through visualization and experimentation. This study evaluated the Geogebra software in the learning process of Calculus. It was observed that the proposed activities helped in the graphical…

  1. Development of a Software Tool for Calculating Transmission Line Parameters and Updating Related Databases

    NASA Astrophysics Data System (ADS)

    Xiu, Wanjing; Liao, Yuan

    2014-12-01

    Transmission lines are essential components of electric power grids. Diverse power system applications and simulation-based studies require transmission line parameters, including series resistance, reactance, and shunt susceptance, and accurate parameters are pivotal in ensuring the accuracy of analyses and reliable system operation. Commercial software packages for performing power system studies usually have their own databases that store the power system model, including line parameters. When there is a physical system model change, the corresponding component in the database of the software packages needs to be modified. Manually updating line parameters is tedious and error-prone. This paper proposes a solution for streamlining the calculation of line parameters and the updating of their values in the respective software databases. The algorithms used for calculating the values of the line parameters are described. The software developed for implementing the solution is described, and typical results are presented. The proposed solution was developed for a utility and has the potential to be put into use by other utilities.
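
    As a sketch of the kind of calculation involved, the textbook formulas for a transposed three-phase line give per-phase series reactance and shunt susceptance from conductor geometry. The code below applies these standard formulas with illustrative numbers; it is not the paper's algorithm, which also handles database updates and utility-specific data.

    ```python
    # Standard geometric line-parameter formulas (a sketch, assumed textbook
    # relations for a transposed three-phase line; illustrative numbers).

    import math

    EPS0 = 8.854187817e-12          # vacuum permittivity, F/m

    def line_parameters(gmd_m, gmr_m, radius_m, freq_hz=60.0):
        """Per-phase series reactance (ohm/m) and shunt susceptance (S/m).
        gmd_m: geometric mean distance between phases; gmr_m: conductor
        geometric mean radius; radius_m: physical radius (for capacitance)."""
        L = 2e-7 * math.log(gmd_m / gmr_m)                    # inductance, H/m
        C = 2 * math.pi * EPS0 / math.log(gmd_m / radius_m)   # capacitance, F/m
        x_series = 2 * math.pi * freq_hz * L
        b_shunt = 2 * math.pi * freq_hz * C
        return x_series, b_shunt

    x, b = line_parameters(gmd_m=9.0, gmr_m=0.0123, radius_m=0.0152)
    print(f"X = {x * 1000:.3f} ohm/km, B = {b * 1000 * 1e6:.3f} uS/km")
    ```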

  2. Automated Tools for Test and Analysis of Radar Warning Receiver Software.

    DTIC Science & Technology

    1982-12-01

    …software requirements for the Data Extraction and Analysis system. The requirements were derived from the user, environmental, and other requirements… [OCR-damaged fragment of a Pascal-like command loop follows in the original: repeat helplevel := min; good_input := false; {loop until valid command} while not good_input do begin write…]

  3. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    ERIC Educational Resources Information Center

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programming language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  4. TeraTools: Multiparameter data acquisition software for the Windows 95/NT OS

    SciTech Connect

    Piercey, R.B.

    1997-12-31

    TeraTools, a general purpose, multiparameter, data acquisition application for Windows 95/NT is described. It is based on the Kmax architecture, which has been used since 1986 on the Macintosh computer at numerous industrial, educational, and research sites worldwide. TeraTools includes high-level support for industry-standard modular instrumentation; a built-in scripting language; drivers for commercially available interfaces; hooks for external code extensions; event file sorting and replay; and a full set of histogramming and display tools. The environment is scalable and may be applied to problems involving a few parameters or many.

  5. Creating Interoperable Meshing and Discretization Software: The Terascale Simulation Tools and Technology Center

    SciTech Connect

    Brown, D.; Freitag, L.; Glimm, J.

    2002-03-28

    We present an overview of the technical objectives of the Terascale Simulation Tools and Technologies center. The primary goal of this multi-institution collaboration is to develop technologies that enable application scientists to easily use multiple mesh and discretization strategies within a single simulation on terascale computers. The discussion focuses on our efforts to create interoperable mesh generation tools, high-order discretization techniques, and adaptive meshing strategies.

  6. Integrating Commercial Off-the-Shelf Tools for Custom Software Development

    DTIC Science & Technology

    1992-06-01

    we appreciate the extensive management support and helpful suggestions that we received from Ed Fitzgerald, Steve Harris, Stu Jolly, and Dave White… the proven graphical nature of these operating systems makes them naturally suitable for modern application development. 3.1 HMI DEVELOPMENT TOOLS In… Hypermedia is an extremely effective tool for implementing proof-of-concept rapid prototypes as well as operational prototypes. Hypermedia is used

  7. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    PubMed

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software package (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it brings together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Different implementations of and improvements to state-of-the-art approaches are reported for image analysis, filtering, normalization, and background subtraction. In particular, two important methodological results are highlighted here: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, and (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects.

  8. Techniques and Tools for Trustworthy Composition of Pre-Designed Embedded Software Components

    DTIC Science & Technology

    2012-07-01

    that was developed as part of the Arduino open-source electronics prototyping platform. The Ardupilot system consists of the hardware which is placed… hand of the user is used to communicate the roll, pitch and throttle information to the UAV. The firmware for the system is written in the Arduino… Measurement Unit (IMU) sensors it can be used to develop an Unmanned Aerial Vehicle. Software for the Ardupilot can be programmed using the Arduino

  9. Instrument-independent software tools for the analysis of MS-MS and LC-MS lipidomics data.

    PubMed

    Haimi, Perttu; Chaithanya, Krishna; Kainu, Ville; Hermansson, Martin; Somerharju, Pentti

    2009-01-01

    Mass spectrometry (MS), particularly electrospray-MS, is the key tool in modern lipidomics. However, as even a modest scale experiment produces a great amount of data, data processing often becomes limiting. Notably, the software provided with MS instruments is not well suited for quantitative analysis of lipidomes because of the great variety of species present and complexities in response calibration. Here we describe the use of two recently introduced software tools: lipid mass spectrum analysis (LIMSA) and spectrum extraction from chromatographic data (SECD), which significantly increase the speed and reliability of mass spectrometric analysis of complex lipidomes. LIMSA is a Microsoft Excel add-on that (1) finds and integrates the peaks in an imported spectrum, (2) identifies the peaks, (3) corrects the peak areas for overlap by isotopic peaks of other species and (4) quantifies the identified species using included internal standards. LIMSA is instrument-independent because it processes text-format MS spectra. Typically, the analysis of one spectrum takes only a few seconds. The SECD software allows one to display MS chromatograms as two-dimensional maps, which is useful for visual inspection of the data. More importantly, however, SECD allows one to extract mass spectra from user-defined regions of the map for further analysis with, e.g., LIMSA. The use of select regions rather than simple time-range averaging significantly improves the signal-to-noise ratio, as signals outside the region of interest are more efficiently excluded. LIMSA and SECD have proven to be robust and convenient tools and are available free of charge from the authors.
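
    The isotope-overlap correction and internal-standard quantification steps (items 3 and 4 above) can be sketched as follows. The data layout and species names are hypothetical and the correction is simplified to a single cascading subtraction; LIMSA's actual implementation is not reproduced here.

    ```python
    # Simplified sketch of isotopic-overlap correction plus internal-standard
    # quantification (hypothetical layout, not LIMSA's code): remove each
    # lighter species' isotope contribution from heavier peaks, then scale
    # all areas by a standard of known amount.

    def correct_and_quantify(species, standard_name, standard_amount):
        """species: list of dicts sorted by m/z, each with keys 'name', 'mz',
        'area' and 'isotopes' ({mass offset in Da: relative abundance})."""
        areas = {s["name"]: s["area"] for s in species}
        by_mz = {int(round(s["mz"])): s["name"] for s in species}
        for s in species:                            # lighter species first
            for offset, rel in s["isotopes"].items():
                heavier = by_mz.get(int(round(s["mz"])) + offset)
                if heavier and heavier != s["name"]:
                    areas[heavier] -= rel * areas[s["name"]]  # remove overlap
        scale = standard_amount / areas[standard_name]        # internal standard
        return {name: max(area, 0.0) * scale for name, area in areas.items()}

    peaks = [
        {"name": "PC28:0(IS)", "mz": 678.5, "area": 50.0,  "isotopes": {}},
        {"name": "PC34:2",     "mz": 758.6, "area": 120.0, "isotopes": {2: 0.12}},
        {"name": "PC34:1",     "mz": 760.6, "area": 80.0,  "isotopes": {2: 0.12}},
    ]
    print(correct_and_quantify(peaks, "PC28:0(IS)", standard_amount=10.0))
    ```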

  10. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    NASA Astrophysics Data System (ADS)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (Forams) are single-celled amoeba-like organisms in the sea, which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time, and their presence in all marine deposits have made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides the success of forams in the oil and gas industry, they are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. However, until recently the best information on this architecture was only obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues about internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all very important parameters that have important effects on the accuracy of the results and on the speed of data-processing. Combining this tomography technique with specific image processing software, in a step called segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber sizes is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is

  11. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    NASA Astrophysics Data System (ADS)

    Marshall, I. P. G.

    2014-12-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and command-line interface. It can be accessed online at http://130.226.247.137/.

  12. USER'S GUIDE: Strategic Waste Minimization Initiative (SWAMI) Version 2.0 - A Software Tool to Aid in Process Analysis for Pollution Prevention

    EPA Science Inventory

    The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool that uses process analysis to identify waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...

  13. AngioLab--a software tool for morphological analysis and endovascular treatment planning of intracranial aneurysms.

    PubMed

    Larrabide, Ignacio; Villa-Uriol, Maria-Cruz; Cárdenes, Rubén; Barbarito, Valeria; Carotenuto, Luigi; Geers, Arjan J; Morales, Hernán G; Pozo, José M; Mazzeo, Marco D; Bogunović, Hrvoje; Omedas, Pedro; Riccobene, Chiara; Macho, Juan M; Frangi, Alejandro F

    2012-11-01

    Determining whether and how an intracranial aneurysm should be treated is a tough decision that clinicians face every day. Emerging computational tools could help clinicians analyze clinical data and make these decisions. AngioLab is a single graphical user interface, developed on top of the open source framework GIMIAS, that integrates some of the latest image analysis and computational modeling tools for intracranial aneurysms. Two workflows are available: Advanced Morphological Analysis (AMA) and Endovascular Treatment Planning (ETP). AngioLab has been evaluated by a total of 62 clinicians, who considered the information provided by AngioLab relevant and meaningful. They acknowledged the emerging need for this type of tool and the potential impact it might have on the clinical decision-making process.

  14. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    NASA Technical Reports Server (NTRS)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy-to-use tool that provides a single user interface for viewing data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, the engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs, both planned and unplanned, and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  15. Fill My Datebook: a software tool to generate and handle lists of events.

    PubMed

    Lewejohann, Lars

    2008-05-01

    Electronic calendars, and especially Internet-based calendars, are becoming more and more popular. Their advantages over paper calendars include being able to easily share events with others, gain remote access, organize multiple calendars, and receive visible and audible reminders. Scientific experiments often include a huge number of events that have to be organized. Experimental schedules that follow a fixed scheme can be described as lists of events. The software application presented here allows for the easy generation, management, and storage of lists of events using the Internet-based application Google Calendar.

  16. A flexible software tool for temporally-precise behavioral control in Matlab.

    PubMed

    Asaad, Wael F; Eskandar, Emad N

    2008-09-30

    Systems and cognitive neuroscience depend on carefully designed and precisely implemented behavioral tasks to elicit the neural phenomena of interest. To facilitate this process, we have developed a software system that allows for the straightforward coding and temporally-reliable execution of these tasks in Matlab. We find that, in most cases, millisecond accuracy is attainable, and those instances in which it is not are usually related to predictable, programmed events. In this report, we describe the design of our system, benchmark its performance in a real-world setting, and describe some key features.

  17. The Facial Aesthetic index: An additional tool for assessing treatment need

    PubMed Central

    Sundareswaran, Shobha; Ramakrishnan, Ranjith

    2016-01-01

    Objectives: Facial Aesthetics, a major consideration in orthodontic diagnosis and treatment planning, may not be judged correctly and completely by simply analyzing dental occlusion or osseous structures. Despite this importance, there is no index to guarantee availability of treatment or to prioritize patients based on their soft tissue treatment needs. Individuals having well-aligned teeth but unaesthetic convex profiles do not get included for treatment as per current malocclusion indices. The aim of this investigation is to develop an aesthetic index based on facial profiles which could be used as an additional tool alongside malocclusion indices. Materials and Methods: A chart showing typical facial profile changes due to underlying malocclusions was generated by soft tissue manipulations of standardized profile photographs of a well-balanced male and female face. A panel of 62 orthodontists judged the profile photographs of 100 patients with different soft tissue patterns to assess profile variations and treatment need. The index was later tested in a cross-section of the school population. Statistical analysis was done using the "irr" package of the R environment, version 2.15.1. Results: The index exhibited very good reliability in determining profile variations (Fleiss kappa 0.866, P < 0.001), excellent reproducibility (kappa 0.9078), and high sensitivity and specificity (95.7%). Testing in the population yielded excellent agreement among orthodontists (kappa 0.9286). Conclusions: A new Facial Aesthetic index, based on the patient's soft tissue profile requirements, is proposed, which can complement existing indices to ensure treatment for those in need. PMID:27127752

  18. Blogs and Wikis as Instructional Tools: A Social Software Adaptation of Just-in-Time Teaching

    ERIC Educational Resources Information Center

    Higdon, Jude; Topaz, Chad

    2009-01-01

    Just-in-Time Teaching (JiTT) methodology uses Web-based tools to gather student responses to questions on preclass reading assignments. However, the technological requirements of JiTT and the content-specific nature of the questions may prevent some instructors from implementing it. Our own JiTT implementation uses publicly and freely available…

  19. Final Report "CoDeveloper: A Secure Web-Invocable Collaborative Software Development Tool"

    SciTech Connect

    Svetlana Shasharina

    2005-11-27

    Modern scientific simulations generate large datasets at remote sites with appropriate resources (supercomputers and clusters). Bringing these large datasets to the computers of all members of a distributed team of collaborators is often impractical or even impossible: there might not be enough bandwidth, storage capacity or appropriate data analysis and visualization tools available locally. To address the need to access remote data, and to avoid heavy Internet traffic and unnecessary data replication, Tech-X Corporation developed a tool which allows running remote data visualization collaboratively and sharing the visualization objects as they are generated. The size of these objects is typically much smaller than the size of the original data. For marketing reasons, we renamed the product CoReViz. Detailed information on this product can be found at http://www.txcorp.com/products/CoReViz/. We installed and tested this tool on multiple machines at Tech-X and on seaborg at NERSC. In what follows, we give a detailed description of this tool.

  20. Word-Tool Match. Review Software for Basic CHOICE. CHOICE (Challenging Options in Career Education).

    ERIC Educational Resources Information Center

    Pitts, Ilse M.; And Others

    CHOICE Word-Tool Match provides migrant youth the opportunity to use the computer in self-directed ways, while reinforcing job and role information presented in Basic Job and Role activity folders and workbooks. Learners select whether to play with one or two players, the career that will provide the theme for the game, and whether to play the…

  1. The Viability of a Software Tool to Assist Students in the Review of Literature

    ERIC Educational Resources Information Center

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem designed to conduct original research. Students need proper training and tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  2. Tools for Teaching Change Management: The Matrix of Change and Supporting Software.

    ERIC Educational Resources Information Center

    Brynjolfsson, Erik; van Alstyne, Marshall; Bernstein, Abraham; Renshaw, Amy Austin

    This paper presents recent developments in provision of support tools for change management and explains how they have been effectively used for teaching students about information technology (IT)-enabled change management in the core IT classes at MIT (Massachusetts Institute of Technology) and Stanford University (California). It also describes…

  3. Software Solutions for ICME

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Engstrom, A.; Bernhardt, R.; Prahl, U.; Adam, L.; Seyfarth, J.; Apel, M.; de Saracibar, C. Agelet; Korzhavyi, P.; Ågren, J.; Patzak, B.

    2016-01-01

    The Integrated Computational Materials Engineering expert group (ICMEg), a coordination activity of the European Commission, aims at developing a global and open standard for information exchange between the heterogeneous varieties of numerous simulation tools. The ICMEg consortium coordinates respective developments by a strategy of networking stakeholders in the first International Workshop on Software Solutions for ICME, compiling identified and relevant software tools into the Handbook of Software Solutions for ICME, discussing strategies for interoperability between different software tools during a second (planned) international workshop, and eventually proposing a scheme for standardized information exchange in a future book or document. The present article summarizes these respective actions to provide the ICME community with some additional insights and resources from which to help move this field forward.

  4. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
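
    A schematic of the grouping logic described above, under strong simplifying assumptions (pairwise compatibility given as a set of pairs, and greedy selection of larger groups first), may clarify the approach; this is an illustration, not the NASARC code:

    ```python
    # Sketch of compatibility grouping and greedy arc assignment
    # (hypothetical, simplified; not the NASARC implementation).

    from itertools import combinations

    def compatible_groups(admins, compatible):
        """Yield pairwise-compatible subsets, largest first."""
        for size in range(len(admins), 0, -1):
            for group in combinations(admins, size):
                if all(frozenset(p) in compatible for p in combinations(group, 2)):
                    yield set(group)

    def assign_arcs(admins, compatible):
        """Greedily cover all administrations with compatible groups, each of
        which would share one predetermined arc segment."""
        unassigned, plan = set(admins), []
        for group in compatible_groups(admins, compatible):
            if group and group <= unassigned:   # take whole uncovered groups
                plan.append(group)
                unassigned -= group
            if not unassigned:
                break
        return plan

    admins = ["A", "B", "C", "D"]
    compatible = {frozenset(p) for p in [("A", "B"), ("A", "C"), ("B", "C")]}
    print(assign_arcs(admins, compatible))      # e.g. [{'A', 'B', 'C'}, {'D'}]
    ```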

  5. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    NASA Astrophysics Data System (ADS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.

  6. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher in resolution than those currently available, LIDAR data allow Earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools that enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of in their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
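
    The DEM-generation step mentioned above reduces, in its simplest form, to gridding point elevations. A minimal binned-mean gridder is sketched below, assuming an N x 3 array of x, y, z coordinates; production LIDAR pipelines use far more sophisticated interpolation than this.

    ```python
    # Minimal binned-mean DEM gridder (illustrative sketch, assumed layout;
    # not the portal's implementation).

    import numpy as np

    def points_to_dem(xyz, cell_size):
        """Grid LIDAR points (N x 3 array of x, y, z) into a mean-elevation
        raster; cells with no returns are NaN."""
        x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
        col = ((x - x.min()) // cell_size).astype(int)
        row = ((y - y.min()) // cell_size).astype(int)
        dem = np.full((row.max() + 1, col.max() + 1), np.nan)
        sums = np.zeros_like(dem)
        counts = np.zeros_like(dem)
        np.add.at(sums, (row, col), z)      # accumulate elevations per cell
        np.add.at(counts, (row, col), 1)    # count returns per cell
        mask = counts > 0
        dem[mask] = sums[mask] / counts[mask]
        return dem

    pts = np.array([[0.2, 0.3, 10.0], [0.8, 0.4, 12.0], [3.1, 2.9, 15.0]])
    print(points_to_dem(pts, cell_size=1.0))
    ```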

  7. An assessment of a software simulation tool for lidar atmosphere and ocean measurements

    NASA Astrophysics Data System (ADS)

    Powell, K. A.; Vaughan, M.; Burton, S. P.; Hair, J. W.; Hostetler, C. A.; Kowch, R. S.

    2014-12-01

    A high-fidelity lidar simulation tool is used to generate synthetic lidar backscatter data that closely matches the expected performance of various lidars, including the noise characteristics inherent to analog detection and uncertainties related to the measurement environment. This tool supports performance trade studies and scientific investigations for both the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), which flies aboard Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and the NASA Langley Research Center airborne High Spectral Resolution Lidar (HSRL). CALIOP measures profiles of attenuated backscatter coefficients (532 and 1064 nm) and volume depolarization ratios at 532 nm. HSRL measures the same profiles plus volume depolarization at 1064 nm and a molecular-only profile which allows for the direct retrieval of aerosol extinction and backscatter profiles at 532 nm. The simulation tool models both the fundamental physics of the lidar instruments and the signals generated from aerosols, clouds, and the ocean surface and subsurface. This work presents the results of a study conducted to verify the accuracy of the simulated data using data from both HSRL and CALIOP. The tool was tuned to CALIOP instrument settings and the model atmosphere was defined using profiles of attenuated backscatter and depolarization obtained by HSRL during underflights of CALIPSO. The validated HSRL data provide highly accurate measurements of the particulate intensive and extensive optical properties and thus were considered as the truth atmosphere. The resulting simulated data were processed through the CALIPSO data analysis system. Comparisons showed good agreement between the simulated and CALIOP data. This verifies the accuracy of the tool to support studies involving the characterization of instrument components and advanced data analysis techniques. The capability of the tool to simulate ocean surface scattering and subsurface

  8. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows the hazard due to hazardous phenomena to be quantified and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software implementation has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool allows the Hazard Curves (HC) calculated in a selected target area to be visualized together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software has been designed with the features of storing and accessing all the data through a MySQL database and of being able to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as will be shown in the applications
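
    The ensemble/uncertainty summary such a tool displays can be sketched in a few lines of NumPy: combine exceedance-probability curves from several models into a weighted mean and percentile curves at each intensity level. Data layout and weights here are hypothetical, not pyPHaz internals; note the percentiles below are unweighted, and only the mean uses the model weights.

    ```python
    # Sketch of hazard-curve ensemble statistics (illustrative, simplified).

    import numpy as np

    def ensemble_hazard(curves, weights=None, percentiles=(16, 50, 84)):
        """curves: M x K array; curves[m, k] = exceedance probability of
        intensity level k under model m."""
        curves = np.asarray(curves, dtype=float)
        mean = np.average(curves, axis=0, weights=weights)
        pcts = np.percentile(curves, percentiles, axis=0)
        return mean, dict(zip(percentiles, pcts))

    curves = [[0.90, 0.50, 0.10],   # model A
              [0.80, 0.40, 0.05],   # model B
              [0.95, 0.60, 0.20]]   # model C
    mean, pcts = ensemble_hazard(curves, weights=[0.5, 0.25, 0.25])
    print(mean)      # weighted mean curve
    print(pcts[50])  # median curve
    ```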

  9. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    SciTech Connect

    Chen, G; Ahunbay, E; Li, X

    2015-06-15

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary MU calculation considering the effect of the magnetic field from the MR-Linac, and verifying delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to the TPS (e.g., Monaco, Elekta), the R&V system (Mosaiq, Elekta), and a secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from the TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in the different systems, ArtQA detects and outputs discrepancies between the TPS, the R&V system, the secondary MU calculation system, and the delivery. To consider the effect of the 1.5T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters, which would be impractical with the commonly used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA checks, eliminating the human error associated with visual inspection. While this tool was developed for online replanning with the MR-Linac, where QA needs to be performed rapidly as the patient lies on the table waiting for treatment, ArtQA can be used as a general QA tool
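
    The cross-system consistency check described in the Methods can be sketched generically as a per-beam parameter comparison. Field names and the tolerance below are hypothetical; ArtQA's real interfaces read TPS files and the R&V database.

    ```python
    # Generic per-beam plan comparison (hypothetical field names; not ArtQA).

    def compare_plans(tps_beams, rv_beams, tolerance=0.01):
        """Each argument: dict of beam name -> dict of parameter -> value.
        Returns a list of (beam, description) discrepancies."""
        discrepancies = []
        for beam, tps_params in tps_beams.items():
            rv_params = rv_beams.get(beam)
            if rv_params is None:
                discrepancies.append((beam, "missing in R&V system"))
                continue
            for param, tps_val in tps_params.items():
                rv_val = rv_params.get(param)
                if rv_val is None or abs(tps_val - rv_val) > tolerance:
                    discrepancies.append(
                        (beam, f"{param}: TPS={tps_val}, R&V={rv_val}"))
        return discrepancies

    tps = {"B1": {"mu": 120.0, "gantry": 180.0}}
    rv = {"B1": {"mu": 118.5, "gantry": 180.0}}
    print(compare_plans(tps, rv))  # [('B1', 'mu: TPS=120.0, R&V=118.5')]
    ```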

  10. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  11. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    NASA Technical Reports Server (NTRS)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  12. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare

    PubMed Central

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinics) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for chronic disease management and telecare follow-up. PMID:24199051

  13. Using Teamcenter engineering software for a successive punching tool lifecycle management

    NASA Astrophysics Data System (ADS)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) for the integrated management of a product lifecycle in a virtual enterprise. The results can also be implemented in a real enterprise. The product considered was a successive punching and cutting tool designed to produce a sheet metal part. The paper defines the technical documentation flow (flow of information) in the process of the computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and can be transferred to other departments of the company in order to supply the materials and resources necessary to achieve the final product. This paper describes the procedure for modifying certain dimensions of the sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the following lifecycle phase, the manufacturing process. For each operation of the technological process the corresponding phases are described in detail. Teamcenter enables the manufacturing company structure to be described, including the workstations that carry out the various operations of the manufacturing process. The paper revealed that the implementation of Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying and correcting documentation, while ensuring the uniqueness and completeness of the product data.

  14. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    ERIC Educational Resources Information Center

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In a traditional classroom, searching for resources is limited to the library and sharing of resources is limited to the…

  15. Automated image mosaics by non-automated light microscopes: the MicroMos software tool.

    PubMed

    Piccinini, F; Bevilacqua, A; Lucarelli, E

    2013-12-01

    Light widefield microscopes and digital imaging are the basis for most of the analyses performed in every biological laboratory. In particular, the microscope's user is typically interested in acquiring highly detailed images for analysing the observed cells and tissues, while remaining representative of a wide area so as to have reliable statistics. The microscopist has to choose between a higher magnification factor and an extension of the observed area, due to the finite size of the camera's field of view. To overcome this trade-off, mosaicing techniques have been developed over the past decades for increasing the camera's field of view by stitching together multiple images. Nevertheless, these approaches typically work in batch mode and rely on motorized microscopes. Alternatively, the methods are conceived just to provide visually pleasant mosaics not suitable for quantitative analyses. This work presents a tool for building mosaics of images acquired with non-automated light microscopes. The method proposed is based on visual information only, and the mosaics are built by incrementally stitching couples of images, making the approach available also for online applications. Seams in the stitching regions as well as tonal inhomogeneities are corrected by compensating for the vignetting effect. In the experiments performed, we tested different registration approaches, confirming that the translation model is not always the best, despite the fact that the motion of the sample holder of the microscope is apparently translational and typically considered as such. The method's implementation is freely distributed as an open source tool called MicroMos. Its usability makes building mosaics of microscope images at subpixel accuracy easier. Furthermore, optional parameters for building mosaics according to different strategies make MicroMos an easy and reliable tool for comparing different registration approaches, warping models and tonal corrections.
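
    One registration model the paper compares, translation-only alignment, can be sketched via phase correlation. This is an illustrative NumPy implementation for integer shifts under periodic-image assumptions, not MicroMos itself (which also handles warping models and vignetting compensation).

    ```python
    # Phase-correlation sketch for translation-only tile registration
    # (illustrative; assumes integer, circular shifts).

    import numpy as np

    def phase_correlation_shift(a, b):
        """Estimate the integer (dy, dx) translation aligning image b onto a."""
        fa, fb = np.fft.fft2(a), np.fft.fft2(b)
        cross = fa * np.conj(fb)
        cross /= np.abs(cross) + 1e-12            # keep phase information only
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > a.shape[0] // 2:                  # wrapped index -> signed shift
            dy -= a.shape[0]
        if dx > a.shape[1] // 2:
            dx -= a.shape[1]
        return int(dy), int(dx)

    img = np.random.rand(64, 64)
    shifted = np.roll(img, (5, -3), axis=(0, 1))
    print(phase_correlation_shift(shifted, img))  # expected: (5, -3)
    ```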

  16. FlowCal: A User-Friendly, Open Source Software Tool for Automatically Converting Flow Cytometry Data from Arbitrary to Calibrated Units.

    PubMed

    Castillo-Hair, Sebastian M; Sexton, John T; Landry, Brian P; Olson, Evan J; Igoshin, Oleg A; Tabor, Jeffrey J

    2016-07-15

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, nonproprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae Venus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond.
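
    The calibration idea is a standard curve fitted over bead peaks of known MEF. A generic sketch follows, with illustrative bead values and a simple log-log linear fit; this is an assumed approach for illustration, not FlowCal's actual API or calibration model.

    ```python
    # Generic bead-based a.u. -> MEF calibration sketch (illustrative numbers
    # and a simplified fit; not FlowCal's implementation).

    import numpy as np

    def fit_au_to_mef(bead_au, bead_mef):
        """Fit log(MEF) = m*log(a.u.) + c over bead peaks; return a converter."""
        m, c = np.polyfit(np.log(bead_au), np.log(bead_mef), 1)
        return lambda au: np.exp(m * np.log(np.asarray(au, dtype=float)) + c)

    # Hypothetical eight-peak bead set: measured medians (a.u.) vs nominal MEF
    bead_au = [85, 240, 690, 1950, 5600, 16000, 46000, 130000]
    bead_mef = [290, 820, 2400, 6900, 20000, 57000, 165000, 470000]
    to_mef = fit_au_to_mef(bead_au, bead_mef)
    print(to_mef([1000, 10000]))  # sample fluorescence, now in MEF units
    ```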

  17. Regional Economic Accounting (REAcct). A software tool for rapidly approximating economic impacts

    SciTech Connect

    Ehlen, Mark Andrew; Vargas, Vanessa N.; Loose, Verne William; Starks, Shirley J.; Ellebracht, Lory A.

    2011-07-01

    This paper describes the Regional Economic Accounting (REAcct) analysis tool that has been in use for the last 5 years to rapidly estimate approximate economic impacts of disruptions due to natural or manmade events. It is based on and derived from the well-known and extensively documented input-output modeling technique initially presented by Leontief and more recently further developed by numerous contributors. REAcct provides county-level economic impact estimates in terms of gross domestic product (GDP) and employment for any area in the United States. The process for using REAcct incorporates geospatial computational tools and site-specific economic data, permitting the identification of geographic impact zones that allow differential magnitude and duration estimates to be specified for regions affected by a simulated or actual event. Using these data as input to REAcct, the number of employees for 39 directly affected economic sectors (including 37 industry production sectors and 2 government sectors) is calculated and aggregated to provide direct impact estimates. Indirect estimates are then calculated using Regional Input-Output Modeling System (RIMS II) multipliers. The interdependent relationships between critical infrastructures, industries, and markets are captured by the relationships embedded in the input-output modeling structure.
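
    The direct-plus-indirect arithmetic reduces, at its core, to scaling affected employment by per-employee GDP and a sector multiplier. The sketch below uses illustrative numbers and simplified logic; it is not the REAcct implementation.

    ```python
    # Back-of-envelope input-output impact sketch (illustrative, simplified).

    def economic_impact(affected, gdp_per_employee, multipliers, duration_days):
        """affected: sector -> number of employees idled.
        Returns (direct, total) GDP impact in dollars over the outage."""
        direct = total = 0.0
        for sector, employees in affected.items():
            daily = employees * gdp_per_employee[sector] / 365.0
            d = daily * duration_days
            direct += d
            total += d * multipliers[sector]   # direct + indirect effects
        return direct, total

    affected = {"manufacturing": 1200, "retail": 800}
    gdp_per_employee = {"manufacturing": 160_000, "retail": 70_000}
    multipliers = {"manufacturing": 2.1, "retail": 1.6}
    print(economic_impact(affected, gdp_per_employee, multipliers,
                          duration_days=14))
    ```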

  18. ASSET: a software tool for the evaluation of manoeuvre capabilities of highly agile satellites

    NASA Astrophysics Data System (ADS)

    Barschke, Merlin F.; Levenhagen, Jens; Reggio, Domenico; Roberts, Peter C. E.

    2014-03-01

    The new generation of agile Earth observation satellites provides much higher observation capabilities than their non-agile predecessors. From a kinematic point of view, these capabilities result in more complex guidance laws for the spacecraft's attitude control system. The computation of these guidance laws is driven by a number of factors. For instance, the Earth's curved shape and its rotation, in combination with the possible scan path geometries, lead to a highly nonlinear relation between the motion of the satellite and the line-of-sight projection onto the Earth. In this paper ASSET (Agile Satellites Scenario Evaluation Tool) is presented. ASSET is a modular MATLAB command line tool developed at Astrium GmbH, Germany, to assess the manoeuvre capabilities of agile satellites carrying time-delayed integration instruments. Each scenario may consist of one or several ground scans, linked by suitable spacecraft slews. Once the entire scenario is defined, ASSET will analyse whether the kinematic and dynamic constraints of a specific satellite allow this scenario to be performed and will then generate the related guidance profile (angles and angular rates). The satellite's ground track, the projection of the instrument's line-of-sight, and the projection of the instrument's field of view onto the Earth can be plotted for visual inspection. ASSET can perform the analysis of scenarios with several different scan modes usually performed by this type of satellite.

  19. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    PubMed

    Chang, Yi-Feng; Chang, Chuan-Hsiung

    2011-01-01

    CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: the lack of dynamic exploratory functions and the absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate the chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement the dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using an SVG viewer through JavaScript. Signal analysis functions are implemented using the R statistical software and a discrete wavelet transform package, waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.

  20. Building an infrastructure at PICKSC for the educational use of kinetic software tools

    NASA Astrophysics Data System (ADS)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; Amorim, L. D.; An, W.; Dalichaouch, T. N.; Davidson, A.; Joglekar, A.; Li, F.; May, J.; Touati, M.; Xu, X. L.; Yu, P.

    2016-10-01

    One aim of the Particle-In-Cell and Kinetic Simulation Center (PICKSC) at UCLA is to coordinate a community development of educational software for undergraduate and graduate courses in plasma physics and computer science. The rich array of physical behaviors exhibited by plasmas can be difficult to grasp by students. If they are given the ability to quickly and easily explore plasma physics through kinetic simulations, and to make illustrative visualizations of plasma waves, particle motion in electromagnetic fields, instabilities, or other phenomena, then they can be equipped with first-hand experiences that inform and contextualize conventional texts and lectures. We are developing an infrastructure for any interested persons to take our kinetic codes, run them without any prerequisite knowledge, and explore desired scenarios. Furthermore, we are actively interested in any ideas or input from other plasma physicists. This poster aims to illustrate what we have developed and gather a community of interested users and developers. Supported by NSF under Grant ACI-1339893.

  1. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    NASA Astrophysics Data System (ADS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  2. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for the detection of pollution sources. However, the resources involved, and the scientific and technological levels needed to manipulate numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, together with the presence of monitoring tools like vessel traffic control systems, can provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or on the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to control the model simulations. Parameters such as the date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPeNDAP) or FTP sites, and then automatically interpolated and pre-processed to be available for the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very
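
    As an illustration of the data-gathering step described above, the sketch below pulls a forecast field from a THREDDS server over OPeNDAP and interpolates it onto a regular grid using the xarray library. The server URL, variable name, region and grid spacing are invented placeholders, and a netCDF/OPeNDAP-capable backend is assumed to be installed.

```python
# A sketch of gathering metocean forecasts over OPeNDAP and regridding them
# for a spill simulator. URL and variable name are hypothetical.
import numpy as np
import xarray as xr

OPENDAP_URL = "https://thredds.example.org/thredds/dodsC/atlantic/hydro_forecast.nc"

ds = xr.open_dataset(OPENDAP_URL)             # lazy remote access, no full download
currents = ds["eastward_sea_water_velocity"]  # placeholder variable name

# Subset to the area and time of interest before pulling any data.
window = currents.sel(longitude=slice(-10.5, -8.0),
                      latitude=slice(36.5, 39.0)).isel(time=0)

# Interpolate onto the simulator's regular grid.
regridded = window.interp(longitude=np.arange(-10.5, -8.0, 0.05),
                          latitude=np.arange(36.5, 39.0, 0.05))
print(regridded.shape)
```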

  3. A Cognitive Tool for Teaching the Addition/Subtraction of Common Fractions: A Model of Affordances

    ERIC Educational Resources Information Center

    Kong, Siu Cheung; Kwok, Lam For

    2005-01-01

    The aim of this research is to devise a cognitive tool for meeting the diverse needs of learners for comprehending new procedural knowledge. A model of affordances on teaching fraction equivalence for developing procedural knowledge for adding/subtracting fractions with unlike denominators was derived from the results of a case study of an initial…

  4. Software tools and preliminary design of a control system for the 40m OAN radiotelescope

    NASA Astrophysics Data System (ADS)

    de Vicente, P.; Bolaño, R.

    2004-07-01

    The Observatorio Astronómico Nacional (OAN) is building a 40m radiotelescope at its facilities in Yebes (Spain), to be delivered by April 2004. The servosystem will be controlled by an ACU (Antenna Control Unit), a real-time computer running VxWorks, which will be commanded from a remote computer (RCC) or from a local computer (LCC) acting as a console. We present the tools we have chosen to develop and run the control system on the RCC, and the criteria behind our choices. We also present a preliminary design of the control system on which we are currently working. The RCC will run a server which communicates with the ACU using sockets, and with the clients, receivers and backends using omniORB, a free implementation of CORBA. Clients written in Python will allow users to control the antenna from any host connected to a LAN or through a secure Internet connection.
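
    The abstract does not specify the command protocol, but the client/server split can be illustrated in a few lines of Python. The sketch below sends a single invented text command to a hypothetical control server over a TCP socket; the host, port, and command syntax are all placeholders.

```python
# A sketch of a remote client commanding an antenna control server over TCP.
import socket

HOST, PORT = "rcc.example.org", 5025        # hypothetical server address

with socket.create_connection((HOST, PORT), timeout=5.0) as sock:
    sock.sendall(b"SLEW AZ 123.45 EL 40.00\n")   # invented command syntax
    reply = sock.recv(1024).decode().strip()
    print("server replied:", reply)
```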

  5. Exon array data analysis using Affymetrix power tools and R statistical software

    PubMed Central

    2011-01-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform. PMID:21498550

  6. Exon array data analysis using Affymetrix power tools and R statistical software.

    PubMed

    Lockstone, Helen E

    2011-11-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform.

  7. MAAC: a software tool for user authentication and access control to the electronic patient record in an open distributed environment

    NASA Astrophysics Data System (ADS)

    Motta, Gustavo H.; Furuie, Sergio S.

    2004-04-01

    Designing proper models for authorization and access control for the electronic patient record (EPR) is essential to wide-scale use of the EPR in large health organizations. This work presents MAAC (Middleware for Authentication and Access Control), a tool that implements a contextual role-based access control (RBAC) authorization model. RBAC regulates users' access to computer resources based on their organizational roles. A contextual authorization uses environmental information available at access-request time, such as the user/patient relationship, in order to decide whether a user has the right to access an EPR resource. The software architecture in which MAAC is implemented uses the Lightweight Directory Access Protocol, the Java programming language, and the CORBA/OMG standards CORBA Security Service and Resource Access Decision Facility. With those open and distributed standards, heterogeneous EPR components can request user authentication and access authorization services in a unified and consistent fashion across multiple platforms.
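
    A contextual RBAC decision of this kind can be sketched compactly. In the toy Python below, a role grants a permission only if a context predicate, here a stand-in for the user/patient care relationship, also holds at access-request time; the names and policy structure are illustrative, not MAAC's actual model.

```python
# A toy contextual RBAC check: role permissions are guarded by predicates
# evaluated against the access-time context. All names are illustrative.
from typing import Callable

Context = dict                  # e.g. {"user": "dr_silva", "patient": "p42"}
Predicate = Callable[[Context], bool]

ATTENDING = {"dr_silva": {"p42"}}   # stand-in for a care-relationship lookup

def is_attending(ctx: Context) -> bool:
    return ctx["patient"] in ATTENDING.get(ctx["user"], set())

ROLES = {"physician": {"epr:read": is_attending}}

def check_access(role: str, permission: str, ctx: Context) -> bool:
    pred = ROLES.get(role, {}).get(permission)
    return pred is not None and pred(ctx)

print(check_access("physician", "epr:read", {"user": "dr_silva", "patient": "p42"}))  # True
print(check_access("physician", "epr:read", {"user": "dr_silva", "patient": "p99"}))  # False
```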

  8. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software (an implementation of reproducible research). This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
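
    One step of such a notebook workflow can be sketched with only free packages. The cell below detects threshold crossings on a synthetic extracellular trace using a common robust noise estimate; the sampling rate, threshold rule, and data are illustrative, not the authors' parameters.

```python
# A notebook-style sketch: threshold-based spike detection on a synthetic trace.
import numpy as np
import matplotlib.pyplot as plt

fs = 20_000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, t.size)
trace[::4000] -= 8                           # inject a few synthetic spikes

# Robust noise estimate (median absolute deviation) sets the threshold.
thresh = -4 * np.median(np.abs(trace)) / 0.6745
crossings = np.flatnonzero((trace[1:] < thresh) & (trace[:-1] >= thresh))

plt.plot(t, trace, lw=0.5)
plt.plot(t[crossings], trace[crossings], "rv", label="detected spikes")
plt.axhline(thresh, color="gray", ls="--", label="threshold")
plt.xlabel("time (s)"); plt.ylabel("amplitude (a.u.)"); plt.legend()
plt.show()
```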

  9. cdfs-sim, cdfs-extract, LFtools: new software tools for XMM-Newton and other missions

    NASA Astrophysics Data System (ADS)

    Ranalli, P.

    2014-07-01

    With the increasing size and complexity of data in modern astrophysics, software is playing a major role among the astronomer's tools. The public availability of code is key to allow a faster advancement of science, and to guarantee the reproducibility of published results. In this poster I will present a collection of programs which I have been developing as part of my research, which are being successfully used by different groups for their publications (XMM-CDFS, Stripe-82, XXL), and which I have publicly released as free software. While currently tuned to XMM-Newton, all of them are extensible to other missions. The list includes: cdfs-sim: a simulator of X-ray astronomical observations. It can simulate an arbitrary set of point sources and reproduce the XMM-Newton background, giving an event file which can be analyzed with SAS. cdfs-extract: a program to extract spectra for multiple sources in multiple XMM-Newton observations. LFtools: a set of programs to compute luminosity functions, with binned estimates, maximum likelihood fits and Bayesian parameter exploration.

  10. Regulatory use of computational toxicology tools and databases at the United States Food and Drug Administration's Office of Food Additive Safety.

    PubMed

    Arvidson, Kirk B; Chanderbhan, Ronald; Muldoon-Jacobs, Kristi; Mayer, Julie; Ogungbesan, Adejoke

    2010-07-01

    Over 10 years ago, the Office of Food Additive Safety (OFAS) in the FDA's Center for Food Safety and Applied Nutrition implemented the formal use of structure-activity relationship analysis and quantitative structure-activity relationship (QSAR) analysis in the premarket review of food-contact substances. More recently, OFAS has implemented the use of multiple QSAR software packages and has begun investigating the use of metabolism data and metabolism predictive models in our QSAR evaluations of food-contact substances. In this article, we provide an overview of the programs used in OFAS as well as a perspective on how to apply multiple QSAR tools in the review process of a new food-contact substance.

  11. The DSET Tool Library: A software approach to enable data exchange between climate system models

    SciTech Connect

    McCormick, J.

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently, computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result, standalone programs were created that represent components of the earth's climate (e.g., an Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components, forming a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled earth simulation is a standalone program designed independently, with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
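
    The mapping service at the heart of such a library can be illustrated with an interpolation from one model grid onto another. The SciPy interpolator below stands in for the library's own polynomial interpolation and weighting services, and the grids are invented.

```python
# A sketch of mapping a field from one model's grid onto another's coordinates.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Source grid (e.g., atmosphere model) and a sample field defined on it.
lat_src = np.linspace(-90, 90, 73)
lon_src = np.linspace(0, 357.5, 144)
field = np.cos(np.deg2rad(lat_src))[:, None] * np.ones((73, 144))

interp = RegularGridInterpolator((lat_src, lon_src), field)

# Target grid (e.g., ocean model) points to map the field onto.
lat_tgt, lon_tgt = np.meshgrid(np.linspace(-80, 80, 33),
                               np.linspace(0, 350, 36), indexing="ij")
points = np.stack([lat_tgt.ravel(), lon_tgt.ravel()], axis=-1)
mapped = interp(points).reshape(lat_tgt.shape)
print(mapped.shape)   # (33, 36)
```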

  12. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
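
    The length-normalized spectral count idea can be shown in a few lines. The sketch below computes an NSAF-style abundance index, which is only one ingredient of the scheme described; freeQuant's shared-peptide and ion-intensity handling are not reproduced here, and all counts are invented.

```python
# A sketch of length-normalized spectral count quantification (NSAF-style).
def nsaf(spectral_counts: dict, lengths: dict) -> dict:
    """Normalized spectral abundance factor per protein."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: round(v / total, 3) for p, v in saf.items()}

counts = {"P1": 120, "P2": 45, "P3": 9}       # MS/MS spectra matched per protein
lengths = {"P1": 600, "P2": 150, "P3": 300}   # sequence lengths (residues)
print(nsaf(counts, lengths))                  # {'P1': 0.377, 'P2': 0.566, 'P3': 0.057}
```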

  13. Fault Injection Software Tools and Robust Design Principles for Reliability and Safety in Measurement Science Education

    NASA Astrophysics Data System (ADS)

    Faller, Lisa-Marie; Zangl, Hubert; Leitzke, Juliana P.

    2016-11-01

    In the design of measurement systems we face the fact that parameters are subject to (measurement) uncertainties. Additionally, components may behave entirely differently from what is specified, which is then considered a fault. Consequently, both uncertainty and the probability of failure should be considered in education on robust design and reliability. In this paper we present a teaching concept based on hardware fault injection, using a simple level-sensor system as an example. Learning objectives include faults, errors, failures, and false alarms versus misses, as well as the advantages and disadvantages of redundancy.
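
    Although the teaching concept above injects faults in hardware, the same learning objectives can be rehearsed in software. The sketch below wraps a simulated level-sensor reading with selectable fault modes and a naive range-check detector; every value is invented.

```python
# A sketch of software-side fault injection for a simulated level sensor.
import random

def true_level() -> float:
    return 42.0 + random.gauss(0, 0.1)        # noisy but healthy sensor

def faulty(read, mode=None, stuck_value=0.0, offset=5.0):
    """Wrap a sensor read function with an injected fault mode."""
    def wrapped():
        value = read()
        if mode == "stuck":
            return stuck_value                # stuck-at fault
        if mode == "offset":
            return value + offset             # calibration-drift fault
        return value                          # fault-free
    return wrapped

healthy = faulty(true_level)
stuck = faulty(true_level, mode="stuck", stuck_value=0.0)

# A naive detector: alarm when the reading leaves a plausible band.
def alarm(value, lo=40.0, hi=44.0) -> bool:
    return not (lo <= value <= hi)

print(alarm(healthy()), alarm(stuck()))       # typically: False True
```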

  14. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  15. The Liege Acromegaly Survey (LAS): a new software tool for the study of acromegaly.

    PubMed

    Petrossians, Patrick; Tichomirowa, Maria A; Stevenaert, Achile; Martin, Didier; Daly, Adrian F; Beckers, Albert

    2012-06-01

    Acromegaly is a chronic rare disease associated with negative pathological effects on multiple systems and organs. We designed a new informatics tool to study data from patients with acromegaly, the Liege Acromegaly Survey (LAS). This relational database permits the inclusion of anonymous historical and prospective data on patients and covers the pathophysiology, clinical features, responses to therapy and long-term outcomes of acromegaly. We deployed the LAS in a validation study at a single center in order to study the characteristics of patients with acromegaly diagnosed at our center from 1970 to 2011. A total of 290 patients with acromegaly were included (147 males and 143 females). There was a linear relationship between age at diagnosis and the date of diagnosis, indicating that older patients are being diagnosed with acromegaly more frequently. A majority presented with macroadenomas (77.5%) and the median diameter was 14 mm. Patients with macroadenomas were significantly younger than patients with microadenomas (P=0.01). GH values at diagnosis decreased with the age of the patients (P=0.01), and there was a correlation between GH values and tumor size at diagnosis (P=0.02). No correlation existed between insulin-like growth factor 1 (IGF-1) levels and tumor characteristics. The prevalence of diabetes was 21.4% in this population and 41.0% had hypertension. Hypertension and diabetes were significantly associated with one another (P<0.001). There was a linear relation between initial GH and IGF-1 levels at diagnosis and those obtained during somatostatin analog (SSA) treatment, and the lowest GH and IGF-1 values following SSA therapy were obtained in older patients (GH: P<0.001; IGF-1: P<0.001). The LAS is a new relational database that is feasible to use in the clinical research setting and permits ready pooling of anonymous patient data from multiple study sites to undertake robust statistical analyses of clinical and therapeutic characteristics.

  16. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications

    PubMed Central

    D’Souza, Malcolm J.; Kashmar, Richard J.; Hurst, Kent; Fiedler, Frank; Gross, Catherine E.; Deol, Jasbir K.; Wilson, Alora

    2015-01-01

    Wesley College is a private, primarily undergraduate, minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data-collection, data-storage, statistical-modeling analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics core requirements, a geographic information systems (GIS) course, and a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course; students can then add major electives that lend further depth and value to their future post-graduate specialty areas. Open-sourced georeferenced census, health, and health disparity data were coupled with GIS and SAS tools in a public health surveillance system project, based on US county zip-codes, to develop use-cases for chronic adult obesity in which income, poverty status, health insurance coverage, education, and age were categorical variables. Across the 48 contiguous states, obesity rates were found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk-factors in its adult population. Furthermore, the 2004–2010 obesity trends showed that in two of the less densely populated Delaware counties, Sussex and Kent, the rates of adult obesity were progressing at much higher proportions when compared to the national average. PMID:26191337

  17. UMMPerfusion: an open source software tool towards quantitative MRI perfusion analysis in clinical routine.

    PubMed

    Zöllner, Frank G; Weisser, Gerald; Reich, Marcel; Kaiser, Sven; Schoenberg, Stefan O; Sourbron, Steven P; Schad, Lothar R

    2013-04-01

    To develop a generic Open Source MRI perfusion analysis tool for quantitative parameter mapping to be used in a clinical workflow, together with methods for quality management of perfusion data. We implemented a classic, pixel-by-pixel deconvolution approach to quantify T1-weighted contrast-enhanced dynamic MR imaging (DCE-MRI) perfusion data as an OsiriX plug-in. It features parallel computing capabilities and an automated reporting scheme for quality management. Furthermore, by design, the implementation is easily extendable to other perfusion algorithms. Obtained results are saved as DICOM objects and directly added to the patient study. The plug-in was evaluated on ten MR perfusion data sets of the prostate and a calibration data set by comparing the obtained parametric maps (plasma flow, volume of distribution, and mean transit time) to a widely used reference implementation in IDL. For all data, parametric maps could be calculated, and the plug-in worked correctly and stably. On average, a deviation of 0.032 ± 0.02 ml/100 ml/min for the plasma flow, 0.004 ± 0.0007 ml/100 ml for the volume of distribution, and 0.037 ± 0.03 s for the mean transit time between our implementation and the reference implementation was observed. By using computer hardware with eight CPU cores, calculation time could be reduced by a factor of 2.5. We successfully developed an Open Source OsiriX plug-in for T1-DCE-MRI perfusion analysis in a routine, quality-managed clinical environment. Using model-free deconvolution, it allows for perfusion analysis in various clinical applications. With our plug-in, information about measured physiological processes can be obtained and transferred into clinical practice.
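
    The model-free deconvolution idea can be sketched on synthetic data: the tissue curve is the arterial input convolved with a flow-scaled residue function, so building a Toeplitz matrix from the input function and inverting it with truncated-SVD regularization recovers the impulse response. This is a generic illustration, not the plug-in's actual numerics.

```python
# A sketch of model-free deconvolution for DCE-MRI on synthetic data.
# C_tissue = AIF (*) IRF, so the IRF is recovered by inverting a Toeplitz
# convolution matrix with truncated-SVD regularization.
import numpy as np

dt = 1.0                                   # frame spacing in seconds (assumed)
t = np.arange(0, 60, dt)
n = t.size

aif = (t / 6.0) * np.exp(-t / 6.0)         # synthetic arterial input function
irf_true = 0.05 * np.exp(-t / 12.0)        # flow-scaled residue function
c_tissue = dt * np.convolve(aif, irf_true)[:n]

# Lower-triangular Toeplitz matrix so that c_tissue = A @ irf.
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                   for i in range(n)])

U, s, Vt = np.linalg.svd(A)
keep = s > 0.15 * s[0]                     # truncation threshold (tunable)
irf_est = Vt.T[:, keep] @ ((U[:, keep].T @ c_tissue) / s[keep])

# The peak of the impulse response estimates plasma flow.
print(f"estimated peak {irf_est.max():.4f} vs true {irf_true.max():.4f}")
```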

  18. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications.

    PubMed

    D'Souza, Malcolm J; Kashmar, Richard J; Hurst, Kent; Fiedler, Frank; Gross, Catherine E; Deol, Jasbir K; Wilson, Alora

    Wesley College is a private, primarily undergraduate, minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data-collection, data-storage, statistical-modeling analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics core requirements, a geographic information systems (GIS) course, and a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course; students can then add major electives that lend further depth and value to their future post-graduate specialty areas. Open-sourced georeferenced census, health, and health disparity data were coupled with GIS and SAS tools in a public health surveillance system project, based on US county zip-codes, to develop use-cases for chronic adult obesity in which income, poverty status, health insurance coverage, education, and age were categorical variables. Across the 48 contiguous states, obesity rates were found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk-factors in its adult population. Furthermore, the 2004-2010 obesity trends showed that in two of the less densely populated Delaware counties, Sussex and Kent, the rates of adult obesity were progressing at much higher proportions when compared to the national average.

  19. CoCoTools: open-source software for building connectomes using the CoCoMac anatomical database.

    PubMed

    Blumenfeld, Robert S; Bliss, Daniel P; Perez, Fernando; D'Esposito, Mark

    2014-04-01

    Neuroanatomical tracer studies in the nonhuman primate macaque monkey are a valuable resource for cognitive neuroscience research. These data ground theories of cognitive function in anatomy, and with the emergence of graph theoretical analyses in neuroscience, there is high demand for these data to be consolidated into large-scale connection matrices ("macroconnectomes"). Because manual review of the anatomical literature is time consuming and error prone, computational solutions are needed to accomplish this task. Here we describe the "CoCoTools" open-source Python library, which automates collection and integration of macaque connectivity data for visualization and graph theory analysis. CoCoTools both interfaces with the CoCoMac database, which houses a vast amount of annotated tracer results from 100 years (1905-2005) of neuroanatomical research, and implements coordinate-free registration algorithms, which allow studies that use different parcellations of the brain to be translated into a single graph. We show that using CoCoTools to translate all of the data stored in CoCoMac produces graphs with properties consistent with what is known about global brain organization. Moreover, in addition to describing CoCoTools' processing pipeline, we provide worked examples, tutorials, links to on-line documentation, and detailed appendices to aid scientists interested in using CoCoTools to gather and analyze CoCoMac data.

  20. An automatic approach for calibrating dielectric bone properties by combining finite-element and optimization software tools.

    PubMed

    Su, Yukun; Kluess, Daniel; Mittelmeier, Wolfram; van Rienen, Ursula; Bader, Rainer

    2016-09-01

    The dielectric properties of human bone are among the most essential inputs required for electromagnetic stimulation aimed at improved bone regeneration. Measuring the electric properties of bone is a difficult task because of the complexity of the bone structure. Therefore, an automatic approach is presented to calibrate the electric properties of bone. The numerical method consists of three steps: generating input from experimental data, performing the numerical simulation, and calibrating the dielectric bone properties. As an example, the dielectric properties of a rabbit distal femur at 20 Hz were calibrated. The calibration was treated as an optimization process, with the aim of finding the dielectric bone properties that best match the numerically calculated and the experimentally measured data sets. The optimization was carried out automatically by the optimization software tool iSIGHT in combination with the finite-element solver COMSOL Multiphysics. As a result, the optimum conductivity and relative permittivity of the rabbit distal femur at 20 Hz were found to be 0.09615 S/m and 19522 for cortical bone, and 0.14913 S/m and 1561507 for cancellous bone, respectively. The proposed method is a potential tool for the identification of realistic dielectric properties of the entire bone volume. The presented approach combining iSIGHT with COMSOL is applicable, amongst others, to the design of implantable electro-stimulative devices and the optimization of electrical stimulation parameters for improved bone regeneration.
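
    The calibration loop itself is straightforward to sketch once the expensive finite-element solve is replaced by a stand-in. Below, a SciPy optimizer takes the role of iSIGHT and a cheap analytic function takes the role of COMSOL; all numbers, including the "measured" potentials, are invented.

```python
# A sketch of the calibration loop with an analytic stand-in for the FEM solve.
import numpy as np
from scipy.optimize import minimize

depths = np.array([1.0, 2.0, 3.0, 4.0])        # electrode depths (mm), invented
measured = np.array([0.74, 0.55, 0.41, 0.30])  # "measured" potentials (V), invented

def forward_model(sigma, log_eps):
    # Cheap analytic stand-in for the finite-element simulation of the bone.
    return np.exp(-sigma * depths) / log_eps

def objective(x):
    sigma, log_eps = x
    if sigma <= 0 or log_eps <= 0.1:           # keep the stand-in well-behaved
        return 1e6
    return float(np.sum((forward_model(sigma, log_eps) - measured) ** 2))

res = minimize(objective, x0=[0.1, 2.0], method="Nelder-Mead")
sigma_opt, log_eps_opt = res.x
print(f"calibrated conductivity ~ {sigma_opt:.3f} S/m, "
      f"relative permittivity ~ {10**log_eps_opt:.0f}")
```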

  1. Results of a Survey Software Development Project Management in the U.S. Aerospace Industry. Volume II. Project Management Techniques, Procedures and Tools.

    DTIC Science & Technology

    1979-12-18

    PROJECT MANAGEMENT IN THE U.S. AEROSPACE INDUSTRY. Volume II: Project Management Techniques, Procedures and Tools. Richard H. Thayer, Sacramento Air... Management techniques and procedures used in software development projects by the US aerospace industry, by Richard H. Thayer and John H. Lehman. This report contains the results of a survey conducted in 1977 and 1978 on how the US aerospace industry manages its software development projects. The sample of

  2. Gmat. A software tool for the computation of the rovibrational G matrix

    NASA Astrophysics Data System (ADS)

    Castro, M. E.; Niño, A.; Muñoz-Caro, C.

    2009-07-01

    In addition, the program should handle the large number of files generated in massive explorations of molecular potential energy hypersurfaces. In these cases, Gmat will provide the G matrix as a function of the molecular structure. Solution method: To reach its objectives, Gmat is organized in two components: an interface and a functional part. This organization separates the input/output tasks, which depend on the human-machine interaction model selected, from the functional requirements, which do not. An object-oriented approach is used in both parts. In the interface, polymorphism allows data acquisition from the output files of different electronic structure codes. In the functional part, Gmat computes numerically the derivatives of the Cartesian coordinates with respect to the vibrational coordinates needed to build the G matrix. Extremely accurate numerical derivatives are obtained by a double procedure. First, the truncation plus roundoff errors are minimized in the central differences expression. Second, the result is embedded in a nine-level Richardson extrapolation process. In the present version, the program allows the use of internal coordinates as vibrational coordinates, with the principal axes of inertia as the body-fixed system. Running time: Sample test runs provided with the distribution take a few seconds to execute.
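
    The double procedure for the derivatives can be illustrated on a scalar function. The sketch below combines a central difference with a Richardson extrapolation tableau; it is truncated to four levels rather than Gmat's nine, and the function is a stand-in for the Cartesian-coordinate dependence on a vibrational coordinate.

```python
# A sketch of central differences refined by Richardson extrapolation.
import math

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson(f, x, h0=0.1, levels=4):
    """Extrapolate central differences D(h0), D(h0/2), ... toward h -> 0."""
    T = [[central(f, x, h0 / 2**i)] for i in range(levels)]
    for j in range(1, levels):
        for i in range(j, levels):
            # Central differences have only even-order error terms, hence 4**j.
            T[i].append(T[i][j - 1] + (T[i][j - 1] - T[i - 1][j - 1]) / (4**j - 1))
    return T[-1][-1]

print(richardson(math.sin, 1.0), "vs exact", math.cos(1.0))
```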

  3. V3 net charge: additional tool in HIV-1 tropism prediction.

    PubMed

    Montagna, Claudia; De Crignis, Elisa; Bon, Isabella; Re, Maria Carla; Mezzaroma, Ivano; Turriziani, Ombretta; Graziosi, Cecilia; Antonelli, Guido

    2014-12-01

    Genotype-based algorithms are valuable tools for identifying patients eligible for CCR5-inhibitor administration in clinical practice. Among the available methods, geno2pheno[coreceptor] (G2P) is the most widely used online tool for tropism prediction. This study was conceived to assess whether combining the G2P prediction with the V3 peptide net charge (NC) value could improve the accuracy of tropism prediction. A total of 172 V3 bulk sequences from 143 patients were analyzed by G2P and NC values. A phenotypic assay was performed by cloning the complete env gene, and tropism determination was assessed on U87_CCR5(+)/CXCR4(+) cells. Sequences were stratified according to the agreement between NC values and G2P results. Of the sequences predicted as X4 by G2P, 61% showed NC values higher than 5; similarly, 76% of sequences predicted as R5 by G2P had NC values below 4. Sequences with NC values between 4 and 5 were associated with different G2P predictions: 65% of samples were predicted as R5-tropic and 35% of sequences as X4-tropic. Sequences identified as X4 by NC value had at least one positive residue at positions known to be involved in tropism prediction and positive residues at position 32. These data supported the hypothesis that NC values between 4 and 5 could be associated with the presence of dual/mixed-tropic (DM) variants. The phenotypic assay performed on a subset of sequences confirmed the tropism prediction for concordant sequences and showed that NC values between 4 and 5 are associated with DM tropism. These results suggest that the combination of G2P and NC could increase the accuracy of tropism prediction. A more reliable identification of X4 variants would be useful for better selecting candidates for Maraviroc (MVC) administration, but also as a predictive marker in coreceptor switching, which is strongly associated with the phase of infection.
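
    The NC value itself is a simple computation: the count of positively charged residues (K, R) minus the count of negatively charged ones (D, E) in the V3 loop. The sketch below applies it, with the thresholds of 4 and 5 discussed above, to an illustrative V3-like sequence (not a patient sequence from this study).

```python
# A sketch of the V3 net charge rule: (#K + #R) - (#D + #E).
def v3_net_charge(seq: str) -> int:
    return sum(seq.count(a) for a in "KR") - sum(seq.count(a) for a in "DE")

v3 = "CTRPNNNTRKRIRIQRGPGRAFVTIGKIGNMRQAHC"   # illustrative V3-like sequence
nc = v3_net_charge(v3)
label = ("X4-like" if nc > 5 else
         "R5-like" if nc < 4 else
         "indeterminate (possible dual/mixed)")
print(nc, label)
```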

  4. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    NASA Astrophysics Data System (ADS)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    input structure (e.g., a checkerboard pattern) will be resolved by data for different types of earthquake-receiver geometries. Additionally, Larry3D, a three-dimensional seismic tomography tool contributed by Boschi, and NonLinLoc, a nonlinear earthquake relocation tool by Anthony Lomax, are both under development. The goal of all of the implemented modules is to aid in teaching research techniques, while remaining flexible enough for use in true research applications. In the long run, SEATREE may contribute to new ways of sharing scientific research, making published (numerical) experiments truly reproducible again. SEATREE can be downloaded as a package from http://geosys.usc.edu/projects/seatree/wiki/, and users can also subscribe to our Subversion project page. The software is designed to run on GNU/Linux-based platforms and has also been successfully run on Mac OS X. Our poster will present the four currently implemented modules, along with our design philosophies and implementation details.

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  6. Developing a Generic Risk Assessment Simulation Modelling Software Tool for Assessing the Risk of Foot and Mouth Virus Introduction.

    PubMed

    Tameru, B; Gebremadhin, B; Habtemariam, T; Nganwa, D; Ayanwale, O; Wilson, S; Robnett, V; Wilson, W

    2008-06-01

    Foot and Mouth disease (FMD) is a highly contagious viral disease that affects all cloven-hoofed animals. Because of its devastating effects on the agricultural industry, many countries take measures to stop the introduction of FMD virus into their territories. Decision makers at multiple levels of the United States Department of Agriculture (USDA) use quantitative and qualitative Risk Assessments (RAs) to make better and more informed, scientifically based decisions to prevent the accidental or intentional introduction of the disease. There is a need for a generic RA that can be applied to any country (whether FMD-free or not) and to any product (FMD-infected animals and animal products). We developed a user-friendly generic RA tool (software) that can be used to conduct and examine different scenarios of quantitative/qualitative risk assessment for countries with varying FMD statuses in relation to the reintroduction of FMD virus into the USA. The program was written in Microsoft Visual Basic 6.0 (Microsoft Corporation, Redmond, Washington, USA). The @Risk 6.1 Developer Kit (RDK) and the @Risk 6.1 Best Fit Kit library (Palisade Corporation, Newfield, NY, USA) were used to build Monte Carlo simulation models. Microsoft Access 2000 (Microsoft Corporation, Redmond, Washington, USA) was used with SQL to query the data. Different input probability distributions can be selected for the nodes in the scenario tree, the output for each end-state of the simulation is given in different graphical formats, and statistical values are used to describe the likelihood of FMD virus introduction. A sensitivity analysis determining which input factors have the greatest effect on the total risk outputs is also provided. The developed generic RA tool can eventually be extended and modified to conduct RAs for other animal diseases and animal products.
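
    The scenario-tree Monte Carlo pattern can be sketched without the commercial add-ins named above: each node's probability is drawn from a distribution rather than fixed, and the release risk is the product along the pathway. The node names and Beta parameters below are invented for illustration.

```python
# A sketch of a scenario-tree Monte Carlo for virus-introduction risk.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
N = 100_000

# Pathway nodes: an infected animal is selected for export -> it evades
# border inspection -> the virus survives transport. Beta distributions
# encode uncertainty about each node probability (parameters invented).
p_infected = rng.beta(2, 998, N)
p_evades = rng.beta(5, 45, N)
p_survives = rng.beta(20, 30, N)

risk = p_infected * p_evades * p_survives           # product along the pathway

print(f"mean risk per consignment: {risk.mean():.2e}")
print(f"95% interval: {np.percentile(risk, 2.5):.2e} .. {np.percentile(risk, 97.5):.2e}")

# Crude sensitivity analysis: rank correlation of each input with the output.
for name, p in [("infected", p_infected), ("evades", p_evades), ("survives", p_survives)]:
    print(f"{name:9s} rank correlation: {spearmanr(p, risk)[0]:+.2f}")
```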

  7. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    NASA Astrophysics Data System (ADS)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan-beam computed tomography (CT) and cone-beam CT (CBCT) imaging. The user can modify the photon source, the object to be imaged and the imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% difference in attenuation coefficient. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
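
    The ray-tracing core reduces, per ray, to the Beer-Lambert law applied over the materials crossed, with scatter ignored as in ImaSim's current implementation. The sketch below computes the transmitted primary fraction for one ray; the attenuation coefficients are rough illustrative values, not ImaSim's data.

```python
# A sketch of a ray-tracing imaging core: primary transmission via Beer-Lambert.
import math

# (material, linear attenuation coefficient mu [1/cm], path length [cm]);
# mu values are rough illustrative numbers for a ~60 keV beam.
ray_path = [
    ("soft tissue", 0.20, 8.0),
    ("bone",        0.55, 2.0),
    ("soft tissue", 0.20, 5.0),
]

transmission = math.exp(-sum(mu * d for _, mu, d in ray_path))
print(f"I/I0 = {transmission:.4f}")   # detector pixel signal before scaling/noise
```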

  8. Software tools -- Man pages

    SciTech Connect

    1996-02-01

    Name, availability, synopsis, description, example, release and last change date are given for each of the following computer codes: DBLOADTEMPLATE(1); GDCT(1); SF2DB(1); SUBTOOL(1); DBLOADRECORDS(3); epvxiMsgLib(1); epvxiLib(1); freeList(1); gpHash(1); DBDATABASE(1); DBFILE(5); and TEMPLATEFILE(5).

  9. Hidden Uses of Presentation Software--The Ideal Tool for Making Customized Materials for Special Needs Students and Clients.

    ERIC Educational Resources Information Center

    Gilden, Deborah

    This paper discusses how presentation software can be used to design custom materials for a variety of people with special needs, including children and adults with low vision, people with developmental disabilities, and stroke patients with cognitive impairments. Benefits of using presentation software include: (1) presentation software gives the…

  10. WASI-2D: A software tool for regionally optimized analysis of imaging spectrometer data from deep and shallow waters

    NASA Astrophysics Data System (ADS)

    Gege, Peter

    2014-01-01

    Image processing software has been developed which allows quantitative analysis of multi- and hyperspectral data from oceanic, coastal and inland waters. It has been implemented in the Water Colour Simulator WASI, which is a tool for the simulation and analysis of optical properties and light field parameters of deep and shallow waters. The new module WASI-2D can import atmospherically corrected images from airborne sensors and satellite instruments in various data formats and units, such as remote sensing reflectance or radiance. It can easily be adapted by the user to different sensors and to the optical properties of the studied area. Data analysis is done by inverse modelling using established analytical models. The bio-optical model of the water column accounts for gelbstoff (coloured dissolved organic matter, CDOM), detritus, and mixtures of up to 6 phytoplankton classes and 2 spectrally different types of suspended matter. The reflectance of the sea floor is treated as a sum of up to 6 substrate types. An analytic model of downwelling irradiance allows wavelength-dependent modelling of sun glint and sky glint at the water surface. The provided database covers the spectral range from 350 to 1000 nm in 1 nm intervals. It can be exchanged easily to represent the optical properties of water constituents, bottom types and the atmosphere of the studied area.

  11. Using a Software Tool in Forecasting: a Case Study of Sales Forecasting Taking into Account Data Uncertainty

    NASA Astrophysics Data System (ADS)

    Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter

    2016-10-01

    Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, the quality of the input data, the time period and other factors. The input data are usually not deterministic but are often of a random nature. They are affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated in a case study model. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variables impact analysis.
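
    The basic pattern, sampling uncertain inputs instead of fixing them, is easy to sketch without the commercial tool used in the case study. Below, a sales forecast becomes a distribution once growth and price are drawn from assumed distributions; all figures are invented.

```python
# A sketch of a sales forecast with uncertain inputs via Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

base_units = 1_000                               # last period's unit sales
growth = rng.triangular(-0.05, 0.03, 0.12, N)    # pessimistic / likely / optimistic
price = rng.normal(25.0, 1.5, N)                 # unit price (currency units)

revenue = base_units * (1 + growth) * price

print(f"point forecast (likely values only): {base_units * 1.03 * 25.0:,.0f}")
print(f"simulated mean: {revenue.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(revenue, 5):,.0f}"
      f" .. {np.percentile(revenue, 95):,.0f}")
```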

  12. Software attribute visualization for high integrity software

    SciTech Connect

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
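
    The monitoring idea can be sketched generically: constraints stated as predicates are evaluated against the program state as execution proceeds, and violations are reported. Plain Python predicates stand in for the SAGE requirements language here; the constraints and trace are invented.

```python
# A sketch of runtime monitoring of an executing program against constraints.
constraints = {
    "tank level in range": lambda s: 0.0 <= s["level"] <= 100.0,
    "pump off when empty": lambda s: not (s["level"] <= 0.0 and s["pump_on"]),
}

def monitor(state: dict) -> list[str]:
    """Return the names of all constraints the current state violates."""
    return [name for name, pred in constraints.items() if not pred(state)]

# Simulated execution trace of the program under observation.
trace = [
    {"level": 50.0, "pump_on": True},
    {"level": 0.0, "pump_on": True},     # violates "pump off when empty"
    {"level": -3.0, "pump_on": False},   # violates "tank level in range"
]
for step, state in enumerate(trace):
    for violation in monitor(state):
        print(f"step {step}: violated: {violation}")
```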

  13. SU-E-T-211: Comparison of Seven New TrueBeam Linacs with Enhanced Beam Data Conformance Using a Beam Comparison Software Tool

    SciTech Connect

    Grzetic, S; Hessler, J; Gupta, N; Woollard, J; DiCostanzo, D; Ayan, A; Carlson, M

    2015-06-15

    Purpose: To develop an independent software tool to assist in commissioning linacs with enhanced beam data conformance, as well as to perform ongoing QA for dosimetrically equivalent linacs. Methods: Linac manufacturers offer enhanced beam conformance as an option to allow clinics to complete commissioning efficiently, as well as to implement dosimetrically equivalent linacs. The specification for enhanced conformance includes PDDs as well as profiles within the 80% FWHM. Recently, we commissioned seven Varian TrueBeam linacs with enhanced beam conformance. We developed a software tool in Visual Basic that loads the reference beam data and compares it against our beam data acquired during commissioning to evaluate enhanced beam conformance. This tool also allowed us to upload the beam data used for commissioning our dosimetrically equivalent beam models, in order to compare and tweak each of our linac beams to match our modelled data in Varian's Eclipse TPS. The tool will also be used during annual QA of the linacs to compare our beam data to our baseline data, as required by TG-142. Results: Our software tool was used to check beam conformance for the seven TrueBeam linacs that we commissioned in the past six months. Using our tool, we found that the factory-conformed linacs showed up to a 3.82% difference in their beam profile data upon installation. Using our beam comparison tool, we were able to adjust the energy and profiles of our beams to achieve better than 1.00% point-by-point conformance. Conclusion: The availability of quantitative comparison tools is essential to accept and commission linacs with enhanced beam conformance, as well as to beam-match multiple linacs. We further intend to use the same tool to ensure our beam data conform to the commissioning beam data during our annual QA, in keeping with the requirements of TG-142.
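
    A point-by-point profile comparison of the kind the tool performs can be sketched as follows: the measured profile is interpolated onto the reference axis and the maximum local percent difference inside the central 80% of the field is reported. The profiles and the window logic below are simplified placeholders, not the authors' implementation.

```python
# A sketch of a point-by-point beam profile conformance check.
import numpy as np

def max_percent_diff(x_ref, d_ref, x_meas, d_meas):
    """Maximum local percent difference inside the central 80% of the field."""
    d_interp = np.interp(x_ref, x_meas, d_meas)
    field = x_ref[d_ref >= 0.5 * d_ref.max()]          # 50%-of-max field width
    core = (x_ref >= 0.8 * field.min()) & (x_ref <= 0.8 * field.max())
    return 100.0 * np.max(np.abs(d_interp[core] - d_ref[core]) / d_ref[core])

x = np.linspace(-15, 15, 301)                          # off-axis position (cm)
ref = np.where(np.abs(x) < 10, 1.0, 0.02)              # idealized reference profile
meas = ref * (1 + 0.005 * np.cos(x / 3))               # slightly perturbed beam
print(f"max point-by-point difference: {max_percent_diff(x, ref, x, meas):.2f}%")
```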

  14. BEESCOUT: A model of bee scouting behaviour and a software tool for characterizing nectar/pollen landscapes for BEEHAVE.

    PubMed

    Becher, M A; Grimm, V; Knapp, J; Horn, J; Twiston-Davies, G; Osborne, J L

    2016-11-24

    Social bees are central place foragers collecting floral resources from the surrounding landscape, but little is known about the probability of a scouting bee finding a particular flower patch. We therefore developed a software tool, BEESCOUT, to theoretically examine how bees might explore a landscape and distribute their scouting activities over time and space. An image file can be imported, which is interpreted by the model as a "forage map" with certain colours representing certain crops or habitat types as specified by the user. BEESCOUT calculates the size and location of these potential food sources in that landscape relative to a bee colony. An individual-based model then determines the detection probabilities of the food patches by bees, based on parameter values gathered from the flight patterns of radar-tracked honeybees and bumblebees. Various "search modes" describe hypothetical search strategies for the long-range exploration of scouting bees. The resulting detection probabilities of forage patches can be used as input for the recently developed honeybee model BEEHAVE, to explore realistic scenarios of colony growth and death in response to different stressors. In example simulations, we find that detection probabilities for food sources close to the colony fit empirical data reasonably well. However, for food sources further away no empirical data are available to validate model output. The simulated detection probabilities depend largely on the bees' search mode, and whether they exchange information about food source locations. Nevertheless, we show that landscape structure and connectivity of food sources can have a strong impact on the results. We believe that BEESCOUT is a valuable tool to better understand how landscape configurations and searching behaviour of bees affect detection probabilities of food sources. It can also guide the collection of relevant data and the design of experiments to close knowledge gaps, and provides a useful

  15. A Software Tool for Estimation of Burden of Infectious Diseases in Europe Using Incidence-Based Disability Adjusted Life Years

    PubMed Central

    Lewandowski, Daniel; Mangen, Marie-Josee J.; Plass, Dietrich; McDonald, Scott A.; van Lier, Alies; Haagsma, Juanita A.; Maringhini, Guido; Pini, Alessandro; Kramarz, Piotr; Kretzschmar, Mirjam E.

    2017-01-01

    The burden of disease framework facilitates the assessment of the health impact of diseases through the use of summary measures of population health such as Disability-Adjusted Life Years (DALYs). However, calculating, interpreting and communicating the results of studies using this methodology poses a challenge. The aim of the Burden of Communicable Disease in Europe (BCoDE) project is to summarize the impact of communicable disease in the European Union and European Economic Area Member States (EU/EEA MS). To meet this goal, a user-friendly software tool (BCoDE toolkit) was developed. This stand-alone application, written in C++, is open-access and freely available for download from the website of the European Centre for Disease Prevention and Control (ECDC). With the BCoDE toolkit, one can calculate DALYs by simply entering the age group- and sex-specific number of cases for one or more of selected sets of 32 communicable diseases (CDs) and 6 healthcare-associated infections (HAIs). Disease progression models (i.e., outcome trees) for these communicable diseases were created following a thorough literature review of their disease progression pathways. The BCoDE toolkit runs Monte Carlo simulations of the input parameters and provides disease-specific results, including 95% uncertainty intervals, and permits comparisons between the different disease models entered. Results can be displayed as mean and median overall DALYs, DALYs per 100,000 population, and DALYs related to mortality vs. disability. Visualization options summarize complex epidemiological data, with the goal of improving communication and knowledge transfer for decision-making. PMID:28107447
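
    The incidence-based calculation the toolkit runs per outcome can be sketched with the standard decomposition DALY = YLL + YLD, with Monte Carlo sampling of the uncertain parameters. Case numbers, weights, and durations below are invented, and only a single outcome is modelled.

```python
# A sketch of an incidence-based DALY calculation with Monte Carlo uncertainty.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

cases = 1_200                           # incident cases entered by the user
deaths = 15
life_expectancy = 35.0                  # remaining life years at age of death

dw = rng.uniform(0.10, 0.20, N)         # uncertain disability weight
duration = rng.uniform(0.05, 0.15, N)   # uncertain illness duration (years)

yld = cases * dw * duration             # years lived with disability
yll = deaths * life_expectancy          # years of life lost (fixed here)
daly = yld + yll

print(f"median DALYs: {np.median(daly):,.0f}")
print(f"95% uncertainty interval: "
      f"{np.percentile(daly, 2.5):,.0f} .. {np.percentile(daly, 97.5):,.0f}")
```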

  16. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a midfidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit once hydrodynamic body-to-body coefficients are introduced and the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, and to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.

  17. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    PubMed

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacture (CAM) technology were used for full crown fabrication and measurement of crown accuracy, in an attempt to establish an open CAM system for dental processing and to promote the introduction of a domestic dental computer-aided design (CAD)/CAM system. Commercially available scanning equipment was used to make a basic digital tooth model after preparation of the crown, and the CAD software that comes with the scanning device was employed to design the crown; domestic industrial CAM software was then used to process the crown data and generate a solid model for machining, after which the China-made 5-axis simultaneous contouring CNC machine tool was used to complete machining of the whole crown, and the internal accuracy of the crown was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool in combination with domestic industrial CAM technology can be used for crown making, and that the crown was well positioned on the die. The internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry, based on a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software, has been established, and that development of the system will promote the introduction of a domestically produced dental CAD/CAM system.

  18. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software tool (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate the experimental findings very well, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and presents the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  19. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have a substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics, which utilizes continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper demonstrates how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.

  20. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits.

  1. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than with older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred of the top MIS executives.

  2. The Online Bioinformatics Resources Collection at the University of Pittsburgh Health Sciences Library System—a one-stop gateway to online bioinformatics databases and software tools

    PubMed Central

    Chen, Yi-Bu; Chattopadhyay, Ansuman; Bergen, Phillip; Gadd, Cynthia; Tannery, Nancy

    2007-01-01

    To bridge the gap between the rising information needs of biological and medical researchers and the rapidly growing number of online bioinformatics resources, we have created the Online Bioinformatics Resources Collection (OBRC) at the Health Sciences Library System (HSLS) at the University of Pittsburgh. The OBRC, containing 1542 major online bioinformatics databases and software tools, was constructed using the HSLS content management system built on the Zope® Web application server. To enhance the output of search results, we further implemented the Vivísimo Clustering Engine®, which automatically organizes the search results into categories created dynamically based on the textual information of the retrieved records. As the largest online collection of its kind and the only one with advanced search results clustering, OBRC is aimed at becoming a one-stop guided information gateway to the major bioinformatics databases and software tools on the Web. OBRC is available at the University of Pittsburgh's HSLS Web site. PMID:17108360

  3. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world, and many techniques have been used to evaluate and mitigate soil degradation. Isotopic techniques are now becoming a powerful tool for assessing soil apportionment. One innovative technique is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has only recently been developed, however, and user-friendly software for data processing and interpretation is lacking. The aim of this article is to introduce a new open source tool, the CSSIAR v.1.00 Software, for working with data sets generated by the CSSI technique to assess soil apportionment.
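
    The idea behind apportionment from δ13C signatures can be shown with a deliberately simplified two-source linear mixing model; the real CSSIAR/SIAR workflow treats many sources and fatty acids in a Bayesian framework, so the sketch and numbers below are illustration only.

```python
# Hypothetical two-source mixing sketch (not the CSSIAR implementation):
# delta_mix = f * delta_a + (1 - f) * delta_b, solved for the fraction f.
def source_fraction(delta_mix, delta_a, delta_b):
    """Fraction of source A in the mixture, from d13C values (per mil)."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# d13C of one fatty-acid biomarker in the sediment and in two candidate soils
f = source_fraction(delta_mix=-28.0, delta_a=-30.0, delta_b=-26.0)
print(f"estimated contribution of source A: {f:.0%}")  # 50%
```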

  4. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program, which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, automatic software configuration, and an all-software generation and validation system.

  5. Methods and software tools for computer-aided design of the spacecraft guidance, navigation and control systems

    NASA Astrophysics Data System (ADS)

    Somov, Yevgeny; Oparin, Gennady

    2017-01-01

    We briefly present results on the development and use of software systems for computer-aided design of spacecraft guidance, navigation and control systems: modeling, synthesis, nonlinear analysis, simulation, and graphic mapping of dynamic processes.

  6. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    SciTech Connect

    BRATZEL, D.R.

    2000-09-28

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks.

  7. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  8. User Acceptance of a Software Tool for Decision Making in IT Outsourcing: A Qualitative Study in Large Companies from Sweden

    NASA Astrophysics Data System (ADS)

    Andresen, Christoffer; Hodosi, Georg; Saprykina, Irina; Rusu, Lazar

    Decisions for IT outsourcing are very complex and need to be supported by considerations based on multiple criteria. In order to facilitate the use of a specific tool by a decision-maker in IT outsourcing, we need to find out whether such a tool will be accepted or rejected, and what improvements must be made to the tool for it to be accepted by IT decision makers in large companies in Sweden.

  9. Risk Assessment Tools

    DTIC Science & Technology

    1994-10-01

    A number of computer-based risk assessment tools were enhanced or created to provide increased access to risk assessment instruments and... produced an extensible authoring tool, SYNTAS, for test instruments that will simplify the data gathering phase of subsequent work. SYNTAS gives DNA... Ultimately it became a computer-assisted software engineering (CASE) tool capable of producing a wide variety of assessment instruments. In addition, its...

  10. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed Toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and that has its center at the dose point. The distance a ray traverses in each material is converted to aluminum or other user-selected equivalent thickness, and all equivalent thicknesses are then summed for each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first lists contact information for help in installing ProE and Fishbowl, along with notes on platform support and system requirements. Second, the document shows the user how to use the software to ray trace a ProE-designed 3D assembly and will serve later as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
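
    The per-ray bookkeeping described above reduces to an areal-density conversion, sketched below with hypothetical material data (the actual ProE/Fishbowl interfaces are not reproduced here).

```python
# Aluminum-equivalent thickness of one ray, as described in the abstract:
# each traversed material contributes (density * path length) / rho_Al.
AL_DENSITY = 2.70  # g/cm^3, reference density of aluminum

def aluminum_equivalent(segments):
    """segments: (density in g/cm^3, path length in cm) per material crossed."""
    return sum(rho * length for rho, length in segments) / AL_DENSITY

# Illustrative ray crossing 0.3 cm of polyethylene and 0.2 cm of steel
ray = [(0.95, 0.3), (7.85, 0.2)]
print(f"{aluminum_equivalent(ray):.3f} cm Al-equivalent")  # ~0.687 cm
```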

  11. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    EPA Science Inventory

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...

  12. Transana Qualitative Video and Audio Analysis Software as a Tool for Teaching Intellectual Assessment Skills to Graduate Psychology Students

    ERIC Educational Resources Information Center

    Rush, S. Craig

    2014-01-01

    This article draws on the author's experience using qualitative video and audio analysis, most notably through use of the Transana qualitative video and audio analysis software program, as an alternative method for teaching IQ administration skills to students in a graduate psychology program. Qualitative video and audio analysis may be useful for…

  13. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J.; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.
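
    The contrast drawn here, an e-mail round-trip versus an interactive socket request, can be illustrated with a generic retrieval client; the host, port, and request line below are purely hypothetical and are not xnetlib's actual protocol.

```python
import socket

# Generic socket-based retrieval sketch (illustrative protocol only).
def fetch(host, port, name):
    with socket.create_connection((host, port)) as s:
        s.sendall(f"GET {name}\n".encode())  # hypothetical request line
        chunks = []
        while data := s.recv(4096):          # read until the server closes
            chunks.append(data)
    return b"".join(chunks)

# fetch("netlib.example.org", 7777, "lapack/dgesv.f")  # illustrative call
```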

  14. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    PubMed

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling; and (4) matrix assisted laser desorption ionization time of flight (MALDI) mass spectrometry analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase.
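
    The core comparison can be sketched as follows: in the inverse experiment the labeling is swapped, so a well-behaved peptide's ratio should be approximately the reciprocal of its direct ratio. Everything below, including the tolerance and the m/z values, is illustrative rather than the DPD implementation.

```python
# Sketch of the direct/inverse ratio comparison described above
# (illustrative only; not the DPD code).
def stable_peptides(direct, inverse, tolerance=0.15):
    """Flag peptides whose direct and inverse 18O ratios agree.

    direct, inverse: dicts mapping peptide m/z -> labeled/unlabeled ratio.
    """
    selected = []
    for mz, r_dir in direct.items():
        r_inv = inverse.get(mz)
        if r_inv and abs(r_dir - 1.0 / r_inv) <= tolerance:
            selected.append(mz)  # paralleled losses -> usable as standard
    return selected

direct = {1045.5: 0.98, 1387.7: 0.60, 1523.8: 1.02}   # made-up ratios
inverse = {1045.5: 1.05, 1387.7: 0.95, 1523.8: 0.97}
print(stable_peptides(direct, inverse))  # [1045.5, 1523.8]
```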

  15. A Serious Videogame as an Additional Therapy Tool for Training Emotional Regulation and Impulsivity Control in Severe Gambling Disorder

    PubMed Central

    Tárrega, Salomé; Castro-Carreras, Laia; Fernández-Aranda, Fernando; Granero, Roser; Giner-Bartolomé, Cristina; Aymamí, Neus; Gómez-Peña, Mónica; Santamaría, Juan J.; Forcano, Laura; Steward, Trevor; Menchón, José M.; Jiménez-Murcia, Susana

    2015-01-01

    Background: Gambling disorder (GD) is characterized by a significant lack of self-control and is associated with impulsivity-related personality traits. It is also linked to deficits in emotional regulation and frequently co-occurs with anxiety and depression symptoms. There is also evidence that emotional dysregulation may play a mediatory role between GD and psychopathological symptomatology. Few studies have reported the outcomes of psychological interventions that specifically address these underlying processes. Objectives: To assess the utility of the Playmancer platform, a serious video game, as an additional therapy tool in a cognitive behavioral therapy (CBT) intervention for GD, and to estimate pre-post changes in measures of impulsivity, anger expression and psychopathological symptomatology. Method: The sample comprised a single group of 16 male treatment-seeking individuals with a severe GD diagnosis. The therapy intervention consisted of 16 weekly group CBT sessions and, concurrently, 10 additional weekly sessions of a serious video game. Pre-post treatment scores on the South Oaks Gambling Screen (SOGS), Barratt Impulsiveness Scale (BIS-11), I7 Impulsiveness Questionnaire (I7), State-Trait Anger Expression Inventory 2 (STAXI-2), Symptom Checklist-Revised (SCL-90-R), State-Trait Anxiety Inventory (STAI-S-T), and Novelty Seeking from the Temperament and Character Inventory-Revised (TCI-R) were compared. Results: After the intervention, significant changes were observed in several measures of impulsivity, anger expression and other psychopathological symptoms. Dropout and relapse rates during treatment were similar to those described in the literature for CBT. Conclusion: Complementing CBT interventions for GD with a specific therapy approach like a serious video game might be helpful in addressing certain underlying factors which are usually difficult to change, including impulsivity and anger expression. PMID:26617550

  16. Ballistocardiogram correction in simultaneous EEG/ fMRI recordings: a comparison of average artifact subtraction and optimal basis set methods using two popular software tools.

    PubMed

    Harrison, Amabilis H; Noseworthy, Michael D; Reilly, James P; Connolly, John F

    2014-01-01

    Electroencephalography data recorded during functional magnetic resonance imaging acquisition are subject to large cardiac-related artifacts that must be corrected during postprocessing. This study compared two widely used ballistocardiogram (BCG) correction algorithms as implemented in two software programs. Reduction of BCG amplitude, correlation of corrected data with electrocardiogram traces, correlation of independent components with electrocardiogram traces, and event-related potential signal-to-noise ratio from each algorithm were compared. Both algorithms effectively reduced the BCG artifact, with a slight advantage of average artifact subtraction over the optimal basis set method (0.1-2.2%) when the quality of the correction was examined at the individual subject level. This study provides users of these software tools with an important, practical, and previously unavailable comparison of the performance of these two methods.
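
    Average artifact subtraction, the first of the two compared methods, can be sketched in simplified form (one channel, fixed epoch window, no template scaling or alignment; the window lengths in samples are arbitrary).

```python
import numpy as np

# Minimal average-artifact-subtraction sketch (illustrative; not the
# evaluated toolboxes): epoch the EEG around each cardiac R peak, average
# the epochs to estimate the BCG template, then subtract the template.
def subtract_bcg(eeg, r_peaks, pre=50, post=300):
    ok = [p for p in r_peaks if p - pre >= 0 and p + post <= len(eeg)]
    template = np.mean([eeg[p - pre:p + post] for p in ok], axis=0)
    corrected = eeg.astype(float).copy()
    for p in ok:
        corrected[p - pre:p + post] -= template
    return corrected

# Synthetic demo: a repeating BCG-like bump riding on noise
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, 5000)
peaks = list(range(400, 4600, 350))          # pseudo R-peak sample indices
for p in peaks:
    eeg[p - 50:p + 300] += np.hanning(350) * 40.0
clean = subtract_bcg(eeg, peaks)
```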

  17. The Perfect Neuroimaging-Genetics-Computation Storm: Collision of Petabytes of Data, Millions of Hardware Devices and Thousands of Software Tools

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.

    2013-01-01

    The volume, diversity and velocity of biomedical data are exponentially increasing, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens-of-thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276

  18. The perfect neuroimaging-genetics-computation storm: collision of petabytes of data, millions of hardware devices and thousands of software tools.

    PubMed

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M; Van Horn, John D; Toga, Arthur W

    2014-06-01

    The volume, diversity and velocity of biomedical data are exponentially increasing, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens-of-thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data.

  19. Additive technology of soluble mold tooling for embedded devices in composite structures: A study on manufactured tolerances

    NASA Astrophysics Data System (ADS)

    Roy, Madhuparna

    Composite textiles have found widespread use and advantages in various industries and applications. The constant demand for high quality products and services requires companies to minimize their manufacturing costs and delivery time in order to compete in general and niche marketplaces. Advanced manufacturing methods aim to provide economical methods of mold production. Creation of molding and tooling options for advanced composites encompasses a large portion of the fabrication time, making it a costly process and a restraining factor. This research discusses a preliminary investigation into the use of soluble polymer compounds and additive manufacturing to fabricate soluble molds. These molds suffer from dimensional errors due to several factors, which have also been characterized. The basic soluble mold of a composite is 3D printed to meet the desired dimensions and geometry of holistic structures or spliced components. The time taken to dissolve the mold depends on the rate of agitation of the solvent. This process is steered towards enabling the implantation of optoelectronic devices within the composite to provide sensing capability for structural health monitoring. The shape deviation of the 3D printed mold is also studied and compared to its original dimensions to optimize the dimensional quality and produce dimensionally accurate parts. Mechanical tests were performed on compact tension (CT) resin samples prepared from these 3D printed molds and revealed crack propagation towards an embedded intact optical fiber.

  20. Demonstration of the Recent Additions in Modeling Capabilities for the WEC-Sim Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-03-01

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion (WEC) devices. The code uses the MATLAB SimMechanics package to solve the multi-body dynamics and models the wave interactions using hydrodynamic coefficients derived from frequency domain boundary element methods. In this paper, the new modeling features introduced in the latest release of WEC-Sim will be presented. The first feature discussed is the conversion of the fluid memory kernel to a state-space approximation that provides significant gains in computational speed. The benefit of the state-space calculation becomes even greater after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with the number of floating bodies. The final feature discussed is the capability to add Morison elements to provide additional hydrodynamic damping and inertia. This is generally used as a tuning feature, because performance is highly dependent on the chosen coefficients. In this paper, a review of the hydrodynamic theory for each of the features is provided and successful implementation is verified using test cases.
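
    A Morison element adds the classic inertia-plus-drag force on a submerged body. The sketch below shows the generic formula, not WEC-Sim source code; the coefficients are made up, which is exactly the sensitivity the abstract flags when it calls Morison elements a tuning feature.

```python
# Morison-element force sketch (generic formula, illustrative values):
# F = rho*Cm*V*du_dt + 0.5*rho*Cd*A*u*|u| -- added inertia plus viscous drag.
RHO = 1025.0  # seawater density, kg/m^3

def morison_force(u, du_dt, Cd, Cm, area, volume):
    inertia = RHO * Cm * volume * du_dt        # inertia (added-mass) term
    drag = 0.5 * RHO * Cd * area * u * abs(u)  # quadratic damping term
    return inertia + drag

# u = fluid velocity (m/s), du_dt = fluid acceleration (m/s^2)
print(f"{morison_force(u=1.2, du_dt=0.4, Cd=1.0, Cm=2.0, area=0.8, volume=0.5):.0f} N")
```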

  1. Software Update.

    ERIC Educational Resources Information Center

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  2. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1988-01-01

    Reviews three computer software programs: (1) "Discovery! Experiences with Scientific Reasoning"--problem solving for grades 4-12 (Apple II); (2) "Organic Stereochemistry"--a tutorial for organic chemistry for advanced secondary/college level (Apple II); and (3) "SHOW PARTNER (2.01)"--a graphics utility tool for…

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  5. Control Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Real-Time Innovations, Inc. (RTI) collaborated with Ames Research Center, the Jet Propulsion Laboratory and Stanford University to leverage NASA research to produce ControlShell software. RTI is the first "graduate" of Ames Research Center's Technology Commercialization Center. The ControlShell system was used extensively on a cooperative project to enhance the capabilities of a Russian-built Marsokhod rover being evaluated for eventual flight to Mars. RTI's ControlShell is complex, real-time command and control software, capable of processing information and controlling mechanical devices. One ControlShell tool is StethoScope. As a real-time data collection and display tool, StethoScope allows a user to see how a program is running without changing its execution. RTI has successfully applied its software savvy in other arenas, such as telecommunications, networking, video editing, semiconductor manufacturing, automobile systems, and medical imaging.

  6. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    NASA Astrophysics Data System (ADS)

    Grape, Sophie; Jacobsson Svärd, Staffan; Lindberg, Bo

    2013-01-01

    The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has for several years been approved by the IAEA for gross defect verification, i.e. to control whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify whether a fraction of the fuel rods in an assembly have been removed or replaced. The latter recognition was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3] to obtain the gamma-ray spectrum from the fission products in the fuel, the Monte Carlo toolkit Geant4 [4] to simulate the gamma-ray transport in and around the fuel and the emission of Cherenkov light, and the ray-tracing programme Zemax [5] to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  7. Man versus Machine: Software Training for Surgeons-An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance.

    PubMed

    Din, Nizar; Smith, Phillip; Emeriewen, Krisztina; Sharma, Anant; Jones, Simon; Wawrzynski, James; Tang, Hongying; Sullivan, Paul; Caputo, Silvestro; Saleh, George M

    2016-01-01

    This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation of -0.6792619 (p = 0.001), -0.6652021 (p = 0.002), and -0.771529 (p = 0001), respectively, with OSACCS. Sixty-two videos were used to evaluate microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with experts at 3.6 (1.6) mm (p < 0.05). The expert surgeons maintained good microscope camera control and limited the total pupil path length travelled to 2512 (1031) mm, compared with novices at 4049 (2709) mm (p < 0.05). Good agreement between human- and machine-quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills.

  8. Man versus Machine: Software Training for Surgeons—An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance

    PubMed Central

    Smith, Phillip; Sharma, Anant; Jones, Simon; Sullivan, Paul

    2016-01-01

    This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation of −0.6792619 (p = 0.001), −0.6652021 (p = 0.002), and −0.771529 (p = 0001), respectively, with OSACCS. Sixty-two videos were used to evaluate microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with experts at 3.6 (1.6) mm (p < 0.05). The expert surgeons maintained good microscope camera control and limited the total pupil path length travelled to 2512 (1031) mm, compared with novices at 4049 (2709) mm (p < 0.05). Good agreement between human- and machine-quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills. PMID:27867658
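
    The agreement test reported in these two records is a rank correlation between a machine-derived metric and a human score; a minimal sketch with made-up numbers (not the study's data) follows.

```python
from scipy.stats import spearmanr

# Correlate a PhacoTrack-style motion metric with OSACCS-style human
# ratings; all values below are illustrative, not the published data.
path_length = [5200, 4800, 4100, 3600, 3300, 2900, 2600]  # mm, per video
osaccs_score = [12, 14, 15, 18, 17, 21, 24]               # human rating

rho, p = spearmanr(path_length, osaccs_score)
print(f"rho = {rho:.2f}, p = {p:.3f}")  # negative rho: shorter path, higher skill
```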

  9. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    PubMed

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field, and is a step towards harmonization and standardization. Embedded functionalities make it a suitable tool for education, for research, and for obtaining distant experts' opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool, with selected teaching case studies, for young professionals who are being introduced to dynamic kidney studies. The software facilitates a better understanding by letting the user practically vary settings and observe their effect on the numerical results. An effort was made to introduce quality assurance instruments at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide.
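
    The Rutland-Patlak approach mentioned above fits K(t)/B(t) against ∫B dt / B(t) over the uptake phase, where K is the kidney curve and B the vascular (background) curve; each kidney's slope estimates uptake, and the slope ratio gives the differential function. A textbook-style sketch with synthetic curves (not the IAEA package's code):

```python
import numpy as np

# Rutland-Patlak sketch: K(t)/B(t) = intercept + slope * Int(B dt)/B(t);
# the fitted slope is the uptake constant for that kidney.
def patlak_slope(t, kidney, blood):
    integral = np.concatenate(
        ([0.0], np.cumsum((blood[1:] + blood[:-1]) / 2 * np.diff(t))))
    x = integral / blood
    y = kidney / blood
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

t = np.linspace(0, 120, 13)               # s, early uptake phase
blood = 1000 * np.exp(-t / 300) + 200     # synthetic vascular curve
left = 0.9 * np.cumsum(blood) + 50        # synthetic kidney curves
right = 0.6 * np.cumsum(blood) + 50
sl, sr = patlak_slope(t, left, blood), patlak_slope(t, right, blood)
print(f"left kidney relative function: {sl / (sl + sr):.0%}")  # ~60%
```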

  10. Software tools that facilitate kinetic modelling with large data sets: an example using growth modelling in sugarcane.

    PubMed

    Uys, L; Hofmeyr, J H S; Snoep, J L; Rohwer, J M

    2006-09-01

    A solution to manage the cumbersome data sets associated with large modelling projects is described. A kinetic model of sucrose accumulation in sugarcane is used to predict changes in sucrose metabolism with sugarcane internode maturity, which results in large amounts of output data to be analysed. Growth is simulated by reassigning maximal activity values, specific to each internode of the sugarcane plant, to parameter attributes of a model object. From a programming perspective, only one model definition file is required for the simulation software used; however, the amount of input data increases with each extra internode that is modelled, and likewise the amount of output data that is generated increases. To store, manipulate and analyse these data, the modelling was performed from within a spreadsheet. This was made possible by the scripting language Python and the modelling software PySCeS, through an embedded Python interpreter available in the Gnumeric spreadsheet program.
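
    A per-internode scan in the spirit of this workflow might look like the sketch below. The model name, parameter attribute, activity values, and result accessor are hypothetical stand-ins; the pysces.model/doSim calls follow documented PySCeS usage but should be read as assumptions, not the paper's code.

```python
import pysces  # simulation engine named in the paper (assumed installed)

mod = pysces.model('sucrose')  # one model definition file for all internodes

vmax_per_internode = {3: 1.8, 4: 1.4, 5: 1.0, 6: 0.7}  # made-up activities
final_sucrose = {}
for internode, vmax in vmax_per_internode.items():
    mod.Vmax_SuSy = vmax               # hypothetical maximal-activity parameter
    mod.doSim(end=1000.0, points=200)  # simulate this internode's kinetics
    # last time point of the sucrose trace (accessor assumed from PySCeS docs)
    final_sucrose[internode] = mod.data_sim.getSimData('sucrose')[-1]
```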

  11. Test and Evaluation of WiMAX Performance Using Open-Source Modeling and Simulation Software Tools

    DTIC Science & Technology

    2010-12-01

    include ns-2/ns-3 (NSNAM), OPNET (OPNET Technologies, Inc.), and QualNet (Scalable Network Technologies). Two important requirements for our T&E of... comparison, OPNET is proprietary software requiring the purchase of multiple licenses and support options at a current cost approaching $60K. An integrated... graphical user interface (GUI) is lacking in ns-2/ns-3, requiring C/C++ programming in order to configure the simulation. OPNET, on the other hand...

  12. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  13. PhasePlot: An Interactive Software Tool for Visualizing Phase Relations, Performing Virtual Experiments, and for Teaching Thermodynamic Concepts in Petrology

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2012-12-01

    The computer program PhasePlot was developed for Macintosh computers and released via the Mac App Store in December 2011. It permits the visualization of phase relations calculated from internally consistent thermodynamic data-model collections, including those from MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213). The software allows users to enter a system bulk composition and a range of reference conditions, and then calculate a grid of phase relations. These relations may be visualized in a variety of ways, including pseudosections, phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. The program interface is user friendly and the computations are fast on laptop-scale machines, which makes PhasePlot amenable to in-class demonstrations, as a tool in instructional laboratories, and as an aid in support of out-of-class exercises and research. Users focus on problem specification and interpretation of results rather than on the manipulation and mechanics of computation. The software has been developed with NSF support and is free. The PhasePlot web site is at phaseplot.org, where extensive user documentation, video tutorials and examples of use may be found. The original release of PhasePlot permitted calculations to be performed on pressure-temperature (P-T) grids by direct minimization of the Gibbs free energy of the system at each grid point. A revision of PhasePlot (scheduled for release to the Mac App Store in December 2012) extends capabilities to include pressure-entropy (P-S) grids by system enthalpy minimization, volume-temperature (V-T) grids by system Helmholtz energy minimization, and volume-entropy (V-S) grids by minimization of the internal energy of the system. P-S gridded results may be utilized to visualize phase relations as a function of heat
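
    The pairing of grid variables with minimized potentials follows the standard Legendre transforms of the internal energy; for reference:

```latex
% Natural thermodynamic potential minimized on each grid type
\begin{align*}
(P,T)\ \text{grid:}\quad &\min\; G = U + PV - TS &&\text{(Gibbs energy)}\\
(P,S)\ \text{grid:}\quad &\min\; H = U + PV      &&\text{(enthalpy)}\\
(V,T)\ \text{grid:}\quad &\min\; A = U - TS      &&\text{(Helmholtz energy)}\\
(V,S)\ \text{grid:}\quad &\min\; U               &&\text{(internal energy)}
\end{align*}
```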

  14. Metaxa: a software tool for automated detection and discrimination among ribosomal small subunit (12S/16S/18S) sequences of archaea, bacteria, eukaryotes, mitochondria, and chloroplasts in metagenomes and environmental sequencing datasets.

    PubMed

    Bengtsson, Johan; Eriksson, K Martin; Hartmann, Martin; Wang, Zheng; Shenoy, Belle Damodara; Grelet, Gwen-Aëlle; Abarenkov, Kessy; Petri, Anna; Rosenblad, Magnus Alm; Nilsson, R Henrik

    2011-10-01

    The ribosomal small subunit (SSU) rRNA gene has emerged as an important genetic marker for taxonomic identification in environmental sequencing datasets. In addition to being present in the nucleus of eukaryotes and the core genome of prokaryotes, the gene is also found in the mitochondria of eukaryotes and in the chloroplasts of photosynthetic eukaryotes. These three sets of genes are conceptually paralogous and should in most situations not be aligned and analyzed jointly. Identifying the origin of SSU sequences in complex sequence datasets has hitherto been a time-consuming and largely manual undertaking. However, the present study introduces Metaxa (http://microbiology.se/software/metaxa/), an automated software tool to extract full-length and partial SSU sequences from larger sequence datasets and assign them to an archaeal, bacterial, nuclear eukaryote, mitochondrial, or chloroplast origin. Using data from reference databases and from full-length organelle and organism genomes, we show that Metaxa detects and scores SSU sequences for origin with very low proportions of false positives and negatives. We believe that this tool will be useful in microbial and evolutionary ecology as well as in metagenomics.

  15. Deciphering P-T paths in metamorphic rocks involving zoned minerals using quantified maps (XMapTools software) and thermodynamics methods: Examples from the Alps and the Himalaya.

    NASA Astrophysics Data System (ADS)

    Lanari, P.; Vidal, O.; Schwartz, S.; Riel, N.; Guillot, S.; Lewin, E.

    2012-04-01

    Metamorphic rocks are made up of mosaics of local thermodynamic equilibria involving minerals that grew at different times and under different pressure (P) and temperature (T) conditions. These local (in space but also in time) equilibria can be identified using micro-structural and textural criteria, but also tested using multi-equilibrium techniques. However, linking deformation with metamorphic conditions requires spatially continuous estimates of P and T conditions in at least two dimensions (P-T maps), which can be superimposed on the observed deformation structures. To this end, we have developed a new Matlab-based GUI software package for microprobe X-ray map processing (XMapTools, http://www.xmaptools.com) based on the quantification method of De Andrade et al. (2006). XMapTools includes functions for quantification processing, two chemical modules (Chem2D, Triplot3D), structural formula functions for common minerals, and more than 50 empirical and semi-empirical geothermobarometers from the literature. XMapTools can easily be coupled with multi-equilibrium thermobarometric calculations. We present examples of application to two natural cases involving zoned minerals. The first example is a low-grade metapelite from the paleo-subduction wedge in the Western Alps (Schistes Lustrés unit) that contains zoned chlorite and phengite together with quartz. The second sample is a Himalayan eclogite from the high-pressure unit of Stak (Pakistan) with an eclogitic garnet-omphacite assemblage retrogressed into a clinopyroxene-plagioclase-amphibole symplectite, and later into amphibole-biotite during the collisional event under crustal conditions. In both samples, P-T paths were recovered using the multi-equilibrium or semi-empirical geothermobarometers included in the XMapTools package. The results are compared and discussed against pseudosections calculated with the sample bulk composition and with different local bulk compositions estimated with XMapTools.

  16. MetMatch: A Semi-Automated Software Tool for the Comparison and Alignment of LC-HRMS Data from Different Metabolomics Experiments

    PubMed Central

    Koch, Stefan; Bueschl, Christoph; Doppler, Maria; Simader, Alexandra; Meng-Reiterer, Jacqueline; Lemmens, Marc; Schuhmacher, Rainer

    2016-01-01

    Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language, it features an intuitive user interface and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., processed LC-HRMS chromatograms with mass-to-charge ratio (m/z) values and retention times) that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley. PMID:27827849
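
    The core matching step described here, pairing features across runs within a mass tolerance (in ppm) and a retention-time window, can be sketched as follows; MetMatch itself is implemented in Java, so the Python below and its tolerances are illustrative only.

```python
# Greedy m/z + retention-time feature matching sketch (illustrative).
def match_features(reference, batch, ppm=10.0, rt_tol=0.25):
    """Pair each batch feature (mz, rt) with the first reference feature
    within ppm mass tolerance and rt_tol minutes."""
    pairs = []
    for mz, rt in batch:
        for ref_mz, ref_rt in reference:
            if (abs(mz - ref_mz) / ref_mz * 1e6 <= ppm
                    and abs(rt - ref_rt) <= rt_tol):
                pairs.append(((ref_mz, ref_rt), (mz, rt)))
                break
    return pairs

ref = [(279.0937, 5.10), (447.0922, 7.85)]                  # made-up features
batch = [(279.0940, 5.21), (447.0931, 7.99), (512.2001, 9.40)]
print(match_features(ref, batch))  # third batch feature has no match
```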

  17. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated with 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  18. Social Software and Academic Practice: Postgraduate Students as Co-Designers of Web 2.0 Tools

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Burchmore, Helen

    2010-01-01

    In order to develop potentially transformative Web 2.0 tools in higher education, the complexity of existing academic practices, including current patterns of technology use, must be recognised. This paper describes how a series of participatory design activities allowed postgraduate students in education, social sciences and computer sciences to…

  19. Computer Lab Tools for Science: An Analysis of Commercially Available Science Interfacing Software for Microcomputers. A Quarterly Report.

    ERIC Educational Resources Information Center

    Weaver, Dave

    Science interfacing packages (also known as microcomputer-based laboratories or probeware) generally consist of a set of programs on disks, a user's manual, and hardware which includes one or more sensory devices. Together with a microcomputer they combine to make a powerful data acquisition and analysis tool. Packages are available for accurately…

  20. How Does Skype, as an Online Communication Software Tool, Contribute to K-12 Administrators' Level of Self-Efficacy?

    ERIC Educational Resources Information Center

    Kiriakidis, Peter

    2012-01-01

    How does Skype, as an online communication tool, contribute to school and district administrators' reported levels of self-efficacy? A sample of n = 39 participants, of which 22 were school administrators and 17 were district administrators, was purposefully selected to use Skype in their offices with a webcam and microphone to communicate with other…