Science.gov

Sample records for addition software tools

  1. Software engineering tools.

    PubMed

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  2. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  3. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", "libtool", and the de facto standard build tool, "make". This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  4. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
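
    The color-screening step the abstract describes can be illustrated compactly. The Python sketch below flags pixels whose false-color value falls in a nominal "delamination" range and compares the flagged area between a baseline image and a later image of the same package; the color key, thresholds, and function names are hypothetical illustrations, not CMeST's actual implementation.

```python
import numpy as np

def delamination_fraction(rgb, red_min=180, other_max=100):
    """Fraction of pixels flagged as delaminated in a false-color CSAM
    image (H x W x 3 uint8 array). A pixel is flagged when its red
    channel dominates -- a stand-in for whatever color key the acoustic
    microscope assigns to delaminated areas. Thresholds are illustrative.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
    return mask.mean(), mask

# Compare a baseline image with a later image of the same package.
baseline = np.zeros((64, 64, 3), dtype=np.uint8)
later = baseline.copy()
later[10:20, 10:20] = (220, 40, 40)   # a new red (delaminated) patch
f0, _ = delamination_fraction(baseline)
f1, _ = delamination_fraction(later)
print(f"delaminated area: baseline {f0:.1%} -> later {f1:.1%}")
```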

  5. NASA Software Estimating Tool (N-SET)

    NASA Technical Reports Server (NTRS)

    Stukes, Sherry

    2006-01-01

    The goals of this project are to: (1) develop an early lifecycle software cost estimation tool leveraging existing data and capabilities; (2) collect additional software data from (a) Jet Propulsion Laboratory, (b) Goddard Space Flight Center, and (c) Marshall Space Flight Center; (3) analyze, normalize, evaluate, stratify, and validate the data; and (4) create a calibrated, validated, and documented tool, initially using available data and subsequently using newly collected data.

  6. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  7. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  8. Software Tools: EPICUR.

    ERIC Educational Resources Information Center

    Abreu, Jose Luis; And Others

    EPICUR (Integrated Programing Environment for the Development of Educational Software) is a set of programming modules ranging from low level interfaces to high level algorithms aimed at the development of computer-assisted instruction (CAI) applications. The emphasis is on user-friendly interfaces and on multiplying productivity without loss of…

  9. Software tools for optical interferometry

    NASA Astrophysics Data System (ADS)

    Thureau, Nathalie D.; Ireland, Michael; Monnier, John D.; Pedretti, Ettore

    2006-06-01

    We describe a set of general purpose utilities for visualizing and manipulating optical interferometry data stored in the FITS-based OIFITS data format. This class of routines contains code like the OiPlot navigation/visualization tool, which allows the user to extract visibility, closure phase and UV-coverage information from the OIFITS files and to display the information in various ways. OiPlot also has basic data model fitting capabilities which can be used for a rapid first analysis of the scientific data. More advanced image reconstruction techniques are part of a dedicated utility. In addition, these routines allow data from multiple interferometers to be combined and used together. Part of our work also aims at developing software specific to the Michigan InfraRed Combiner (MIRC). Our experience designing a flexible and robust graphical user interface based on sockets using Python libraries has wide applicability, and this paper discusses the practicalities.
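
    Since OIFITS files are ordinary FITS binary tables, the quantities OiPlot displays can be pulled out with a few lines of code. The sketch below uses astropy to read visibility amplitudes, (u,v) coordinates, and closure phases from the standard OI_VIS and OI_T3 tables; it is a minimal reading example under the published OIFITS column names, not OiPlot's own code, and the file name is a placeholder.

```python
from astropy.io import fits

# Read visibility and closure-phase data from an OIFITS file.
# Table and column names (OI_VIS, VISAMP, UCOORD, ...) follow the
# OIFITS standard; "observation.oifits" is a placeholder.
with fits.open("observation.oifits") as hdul:
    vis = hdul["OI_VIS"].data
    t3 = hdul["OI_T3"].data
    visamp = vis["VISAMP"]                # visibility amplitudes
    u, v = vis["UCOORD"], vis["VCOORD"]   # projected baselines (m)
    closure = t3["T3PHI"]                 # closure phases (deg)

print(f"{len(visamp)} visibility points, {len(closure)} closure phases")
```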

  10. Fermilab Software Tools Program: Fermitools

    SciTech Connect

    Pordes, R.

    1995-10-01

    The Fermilab Software Tools Program (Fermitools) was established in 1994 as an initiative under which Fermilab provides software it has developed to outside collaborators. During the year and a half since its start, ten software products have been packaged and made available on the official Fermilab anonymous ftp site, and backup support and information services have been made available for them. During the past decade, institutions outside the Fermilab physics experiment user community have in general only been able to obtain and use Fermilab-developed software on an ad hoc or informal basis. With the Fermitools program the Fermilab Computing Division has instituted an umbrella under which software that is regarded by its internal user community as useful and of high quality can be provided to users outside of High Energy Physics experiments. The main thrust of the Fermitools program is stimulating collaborative use and further development of the software. Establishing minimal umbrella bureaucracy makes collaborative development and support easier. The published caveat given to people who take the software includes the statement "Provision of the software implies no commitment of support by Fermilab. The Fermilab Computing Division is open to discussing other levels of support for use of the software with responsible and committed users and collaborators." There have been no negative comments in response to this and the policy has not given rise to any questions or complaints. In this paper we present the goals and strategy of the program and introduce some of the software made available through it. We discuss our experiences to date and mention the perceived benefits of the Program.

  11. Component Modeling Approach Software Tool

    2010-08-23

    The Component Modeling Approach Software Tool (CMAST) establishes a set of performance libraries of approved components (frames, glass, and spacer) which can be accessed for configuring fenestration products for a project, and obtaining a U-factor, Solar Heat Gain Coefficient (SHGC), and Visible Transmittance (VT) rating for those products, which can then be reflected in a CMA Label Certificate for code compliance. CMAST is web-based as well as client-based. The completed CMA program and software tool will be useful in several ways for a vast array of stakeholders in the industry: generating performance ratings for bidding projects; ascertaining credible and accurate performance data; and obtaining third-party certification of overall product performance for code compliance.

  12. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  13. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  14. AUTOSIM: An automated repetitive software testing tool

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Mcbride, S. E.

    1985-01-01

    AUTOSIM is a software tool which automates the repetitive run testing of software. This tool executes programming tasks previously performed by a programmer with one year of programming experience. Use of the AUTOSIM tool requires a knowledge base containing information about known faults, code fixes, and the fault diagnosis-correction process. AUTOSIM can be considered as an expert system which replaces a low level of programming expertise. Reference information about the design and implementation of the AUTOSIM software test tool provides flowcharts to assist in maintaining the software code and a description of how to use the tool.

  15. Software engineering environment tool set integration

    NASA Technical Reports Server (NTRS)

    Selfridge, William P.

    1986-01-01

    Space Transportation System Division (STSD) Engineering has a program to promote excellence within the engineering function. This program resulted in a capital-funded facility based on a VAX cluster called the Rockwell Operational Engineering System (ROSES). The second phase of a three-phase plan to establish an integrated software engineering environment for ROSES is examined. It briefly discusses phase one, which established the basic capability for a modern software development environment, including a tool set, training, and standards. Phase two is a tool set integration. The tool set is primarily off-the-shelf tools acquired through vendors or government agencies (public domain). These tools were placed into categories of software development. These categories are: requirements, design, and construction support; verification and validation support; and software management support. The integration of the tool set is being performed through concept prototyping and development of tools specifically designed to support the life cycle and provide transition from one phase to the next.

  16. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  17. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; and (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects. PMID:18946980
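
    Steps (ii)-(iv) reduce to a per-pixel table lookup followed by a multiply. The Python sketch below illustrates that pipeline with a hypothetical calibration table mapping shielded-time fraction to correction factor; the table values and names are illustrative and are not taken from the paper.

```python
import numpy as np

# Hypothetical calibration table: fraction of beam-on time a pixel is
# shielded by MLC leaves -> EPID response correction factor.
frac_table = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
cf_table = np.array([1.00, 0.98, 0.95, 0.92, 0.88])  # illustrative values

def corrected_pdip(pdip, shielded_fraction):
    """Apply interpolated correction factors to a predicted portal dose
    image. Both arguments are 2-D arrays of the same shape."""
    cf = np.interp(shielded_fraction, frac_table, cf_table)
    return pdip * cf

pdip = np.ones((4, 4))                        # toy predicted image
shield = np.linspace(0, 1, 16).reshape(4, 4)  # toy shielded-time fractions
print(corrected_pdip(pdip, shield))
```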

  20. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  1. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  2. Software development tools: A bibliography, appendix C.

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.

    1980-01-01

    A bibliography containing approximately 200 citations on tools which help software developers perform some development task (such as text manipulation, testing, etc.), and which would not necessarily be found as part of a computing facility is given. The bibliography comes from a relatively random sampling of the literature and is not complete. But it is indicative of the nature and range of tools currently being prepared or currently available.

  3. Software Tools for Empowering Instructional Developers.

    ERIC Educational Resources Information Center

    Gayeski, Diane M.

    1991-01-01

    Software systems are being created to assist both novice and expert instructional technologists in response to perceived need of organizations to increase their training. Underlying philosophies and goals of instructional developer automation tools and their potential effects upon the organizations who adopt them must be examined so they will help…

  4. Commercial Expert-System-Building Software Tools

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1989-01-01

    Report evaluates commercially available expert-system-building tools in terms of structures, representations of knowledge, inference mechanisms, interfaces with developers and end users, and capabilities for performing such functions as diagnosis and design. The software tools are commercialized derivatives of artificial-intelligence systems developed by researchers at universities and research organizations. They reduce the time to develop an expert system by an order of magnitude compared to that required with such traditional artificial-intelligence development languages as LISP. A table lists 20 such tools, rating attributes as strong, fair, programmable by user, or having no capability against various criteria.

  5. SUSTAINABLE REMEDIATION SOFTWARE TOOL EXERCISE AND EVALUATION

    SciTech Connect

    Kohn, J.; Nichols, R.; Looney, B.

    2011-05-12

    The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

  6. New Software Framework to Share Research Tools

    NASA Astrophysics Data System (ADS)

    Milner, Kevin; Becker, Thorsten W.; Boschi, Lapo; Sain, Jared; Schorlemmer, Danijel; Waterhouse, Hannah

    2009-03-01

    Solid Earth Teaching and Research Environment (SEATREE) is modular and user-friendly software that facilitates the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. The software provides a stand-alone open-source package that allows users to operate in a “black box” mode, which hides implementation details, while also allowing them to dig deeper into the underlying source code. The overlying user interfaces are written in the Python programming language using a modern, object-oriented design, including graphical user interactions. SEATREE, which provides an interface to a range of new and existing lower level programs that can be written in any computer programming language, may in the long run contribute to new ways of sharing scientific research. By sharing both data and modeling tools in a consistent framework, published (numerical) experiments can be made truly reproducible again.

  7. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  8. A software tool for dataflow graph scheduling

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1994-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
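
    The partial order a dataflow graph imposes is exactly a topological order on tasks, and the sets of simultaneously ready tasks are what a multiprocessor schedule exploits. Below is a minimal Python sketch with made-up task names, using the standard-library graphlib rather than the paper's tool.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Dataflow graph: each task maps to the set of tasks producing its inputs.
graph = {"fft": {"read"}, "filter": {"fft"}, "scale": {"read"},
         "mix": {"filter", "scale"}}

ts = TopologicalSorter(graph)
ts.prepare()
step = 0
while ts.is_active():
    ready = list(ts.get_ready())   # tasks whose inputs are all available
    print(f"step {step}: run in parallel -> {sorted(ready)}")
    ts.done(*ready)
    step += 1
```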

  9. Structure and software tools of AIDA.

    PubMed

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independent and is even to a great extent multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  10. Knowledge engineering software: A demonstration of a high end tool

    SciTech Connect

    Salzman, G.C.; Krall, R.B.; Marinuzzi, J.G.

    1987-01-01

    Many investigators wanting to apply knowledge-based systems (KBS) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with the high end KBS tools such as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. To illustrate the capability of one of these high end tools we have converted a table showing the classification of benign soft tissue tumors into a KEE knowledge base. We have used the tools available in KEE to identify the tumor type for a hypothetical patient. 10 figs.

  11. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  12. Additive manufacturing of tools for lapping glass

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files which the VFlash used in building the parts, layer by layer, from a UV curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center. This indicated that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit corundum aluminum oxide lapping paste, until a near specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimize the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometry and groove patterns for improved loose abrasive finishing.

  13. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT commands optimisation in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly, by several times, reduced software build time and environment setup time, and increased the efficiency of
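
    The requirements-file caching described above can be illustrated generically: parse a package's configuration once, store the parsed result keyed by the file's modification time, and let subsequent build commands reuse it. The Python sketch below shows only the strategy; the parsing and cache format are hypothetical and are not CMT's.

```python
import os
import pickle

def load_requirements(path, cache_dir=".build_cache"):
    """Return the parsed contents of a requirements-style file, reusing
    a cached copy while the file is unchanged (keyed by mtime).
    Illustrates the caching strategy only, not CMT's actual format."""
    os.makedirs(cache_dir, exist_ok=True)
    cache = os.path.join(cache_dir, os.path.basename(path) + ".pkl")
    mtime = os.path.getmtime(path)
    if os.path.exists(cache):
        with open(cache, "rb") as f:
            stamp, parsed = pickle.load(f)
        if stamp == mtime:
            return parsed          # cache hit: skip the re-parse
    with open(path) as f:
        parsed = [line.split() for line in f if line.strip()]
    with open(cache, "wb") as f:
        pickle.dump((mtime, parsed), f)
    return parsed
```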

  14. Software Tools for Stochastic Simulations of Turbulence

    NASA Astrophysics Data System (ADS)

    Kaufman, Ryan

    We present two software tools useful for the analysis of mesh-based physics application data, and specifically for turbulent mixing simulations. Each has a broader, but separate, scope, as we describe. Both features play a key role as we push computational science to its limits, and thus the present work contributes to the frontier of research. The first tool is Wstar, a weak* comparison tool, which addresses the stochastic nature of turbulent flow. The goal is to compare underresolved turbulent data in convergence, parameter dependence, or validation studies. This is achieved by separating space-time data from state data (e.g. density, pressure, momentum, etc.) through coarsening and sampling. The collection of fine grained data in a single coarse cell is treated as a random sample in state space, whose cumulative distribution function defines a measure within that cell. This set of measures, with the spatial dependence defined by the coarse grid, defines a Young measure solution to the PDE. The second tool is a front tracking application programming interface (API) called FTI. It has the capability to generate geometric surfaces (e.g. the location of interspecies boundaries) of high complexity, and track them dynamically. FTI also includes the ghost fluid method, which enables mesh-based fluid codes to maintain sharpness at interspecies boundaries by modifying solution stencils that cross such a boundary. FTI outlines and standardizes the methods involved in this model. FronTier, as developed here, is a software package which implements this standard. The client must implement the physics and grid interpolation routines outlined in the client interface to FTI. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code, FLASH; and two locally constructed fluid codes, cFluid and iFluid, for compressible and incompressible flow respectively.
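
    The coarsening-and-sampling idea behind Wstar can be sketched simply: pool the fine-grid values falling in each coarse cell into an empirical distribution, then compare two runs cell by cell through those distributions. Below is a minimal numpy illustration assuming 2-D fields and using the Kolmogorov-Smirnov distance between samples as the per-cell comparison; the actual tool works with Young measures and its own metrics.

```python
import numpy as np

def cell_samples(field, c=4):
    """Group a 2-D fine-grid field into c x c coarse cells; each row of
    the result is the sample of fine values from one cell."""
    h, w = field.shape
    b = field[: h - h % c, : w - w % c].reshape(h // c, c, w // c, c)
    return b.transpose(0, 2, 1, 3).reshape(-1, c * c)

def ks_distance(a, b):
    """Kolmogorov-Smirnov distance: max gap between empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / len(s)
    return np.abs(cdf(a) - cdf(b)).max()

rng = np.random.default_rng(0)
run1 = rng.normal(size=(64, 64))            # stand-ins for two simulations
run2 = rng.normal(0.2, 1.0, size=(64, 64))
d = [ks_distance(x, y) for x, y in zip(cell_samples(run1), cell_samples(run2))]
print(f"mean per-cell KS distance: {np.mean(d):.3f}")
```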

  15. STAYSL PNNL Suite of Software Tools.

    SciTech Connect

    GREENWOOD, LARRY R.

    2013-07-19

    Version: 00 The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
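
    In generic notation (not necessarily STAYSL's exact parameterization, and neglecting cross-section covariance terms for brevity), the generalized least-squares adjustment has a standard closed form: with prior spectrum phi_0 and covariance M_phi, measured reaction rates a with covariance M_a, and cross-section matrix A (reactions by energy groups),

```latex
\hat{\phi} = \phi_0 + M_\phi A^{\mathsf T}
             \left( A M_\phi A^{\mathsf T} + M_a \right)^{-1}
             \left( a - A \phi_0 \right),
\qquad
M_{\hat{\phi}} = M_\phi - M_\phi A^{\mathsf T}
             \left( A M_\phi A^{\mathsf T} + M_a \right)^{-1} A M_\phi .
```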

  16. STAYSL PNNL Suite of Software Tools.

    2013-07-19

    Version: 00 The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.

  17. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  18. Tool support for software lookup table optimization

    PubMed Central

    Strout, Michelle Mills; Bieman, James M.

    2012-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963
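
    The underlying LUT technique is easy to demonstrate in a few lines. The Python sketch below precomputes exp(x) over a fixed domain and serves later calls by nearest-neighbor lookup, exposing the size/accuracy tradeoff that tools like Mesa automate; it is an illustration of the technique, not Mesa's generated code.

```python
import math
import numpy as np

class ExpTable:
    """Nearest-neighbor lookup table for exp(x) on [lo, hi]. Larger
    tables cost memory but shrink the worst-case approximation error --
    the performance/accuracy tradeoff discussed above."""
    def __init__(self, lo=0.0, hi=10.0, size=100_000):
        self.lo = lo
        self.step = (hi - lo) / (size - 1)
        self.table = np.exp(np.linspace(lo, hi, size))

    def __call__(self, x):
        i = int((x - self.lo) / self.step + 0.5)  # nearest table slot
        return self.table[i]

fast_exp = ExpTable()
x = 3.14159
err = abs(fast_exp(x) - math.exp(x)) / math.exp(x)
print(f"lookup {fast_exp(x):.6f} vs exact {math.exp(x):.6f} (rel err {err:.2e})")
```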

  19. Tool support for software lookup table optimization.

    PubMed

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M

    2011-12-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963

  20. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
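
    The "exact Bayes factors" mentioned here are ratios of marginal likelihoods, which take closed forms in conjugate exponential-family models; stated in standard notation for completeness (this is the textbook definition, not a formula taken from the paper):

```latex
B_{12} \;=\; \frac{p(D \mid M_1)}{p(D \mid M_2)}
       \;=\; \frac{\int p(D \mid \theta_1, M_1)\, p(\theta_1 \mid M_1)\, d\theta_1}
                  {\int p(D \mid \theta_2, M_2)\, p(\theta_2 \mid M_2)\, d\theta_2}.
```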

  1. Tool Support for Software Lookup Table Optimization

    DOE PAGES (Beta)

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.

  2. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    SciTech Connect

    Not Available

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  3. Tools and Behavioral Abstraction: A Direction for Software Engineering

    NASA Astrophysics Data System (ADS)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  4. Integration of case tools for software project management

    SciTech Connect

    Paul, R.; Shinagawa, Y.; Khan, M.F.

    1996-12-31

    Building and maintaining high quality large software projects is a complex and difficult process. Tools employing software metrics are becoming an effective aid for the management of such large projects. In this paper, we briefly trace the evolution of such tools from their beginnings up to the current trend of integrated CASE tools. We present a generic integrated CASE environment incorporating a formal set of software metrics with a suite of advanced analytic techniques. The proposed integrated CASE environment is an enhancement of currently used tools, and can enable more efficient and cost-effective management of large and complex software projects.

  5. VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1994-01-01

    VTGRAPH is a graphics software tool for DEC/VT or VT-compatible terminals, which are widely used by government and industry. It is a FORTRAN- or C-language callable library designed to allow the user to deal with many computer environments which use VT terminals for window management and graphic systems. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design friendlier user interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGIS graphics set, which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10-compatible drawing, generic program routines for two- and three-dimensional plotting, and color graphics or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGIS graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.

  6. CASRE: An Easy-to-Use Software Reliability Measurement Tool

    NASA Technical Reports Server (NTRS)

    Nikora, A.; Lyu, M.; Farr, W.

    1993-01-01

    This paper describes the implementation of a software reliability measurement tool, CASRE, that incorporates the mathematical modeling capabilities of the public domain tool SMERFS, and is being implemented in a Microsoft Windows environment.
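
    Tools of this kind fit parametric reliability-growth models to observed failure data. As one well-known member of the class (chosen purely for illustration; the abstract does not enumerate the models CASRE or SMERFS provide), the Goel-Okumoto nonhomogeneous Poisson process model takes the expected number of failures observed by time t, and the corresponding failure intensity, to be

```latex
m(t) = a\left(1 - e^{-bt}\right), \qquad
\lambda(t) = \frac{dm}{dt} = a\,b\,e^{-bt},
```

    where a is the expected total number of failures and b is the per-fault detection rate.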

  7. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
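
    The n-factor combinatorial idea can be made concrete: rather than running the full cross-product of parameter values, select a subset of cases that still covers every value combination for each pair of parameters (n = 2). The Python sketch below uses hypothetical parameters and a simple greedy selection; it illustrates the coverage criterion only, not the Trick/AutoBayes/TAR3 toolchain.

```python
from itertools import combinations, product

# Hypothetical simulation parameters.
params = {"mass": [1.0, 1.5], "thrust": ["low", "high"], "mode": ["A", "B", "C"]}
names = list(params)

# Every pair of parameters, with every value combination for that pair,
# must appear in at least one selected test case (2-factor coverage).
uncovered = {(p, q, vp, vq)
             for p, q in combinations(names, 2)
             for vp, vq in product(params[p], params[q])}

suite = []
for case in product(*params.values()):   # candidates: the full cross-product
    assign = dict(zip(names, case))
    hits = {(p, q, assign[p], assign[q]) for p, q in combinations(names, 2)}
    if hits & uncovered:                 # greedy: keep cases that add coverage
        suite.append(assign)
        uncovered -= hits
    if not uncovered:
        break

full = 1
for vals in params.values():
    full *= len(vals)
print(f"{len(suite)} cases cover all parameter pairs; the full grid has {full}")
```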

  8. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  9. A Tool for Managing Software Architecture Knowledge

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    This paper describes a tool for managing architectural knowledge and rationale. The tool has been developed to support a framework for capturing and using architectural knowledge to improve the architecture process. This paper describes the main architectural components and features of the tool. The paper also provides examples of using the tool for supporting well-known architecture design and analysis methods.

  10. Free software tools for atlas-based volumetric neuroimage analysis

    NASA Astrophysics Data System (ADS)

    Bazin, Pierre-Louis; Pham, Dzung L.; Gandler, William; McAuliffe, Matthew

    2005-04-01

    We describe new and freely available software tools for measuring volumes in subregions of the brain. The method is fast, flexible, and employs well-studied techniques based on the Talairach-Tournoux atlas. The software tools are released as plug-ins for MIPAV, a freely available and user-friendly image analysis software package developed by the National Institutes of Health. Our software tools include a digital Talairach atlas that consists of labels for 148 different substructures of the brain at various scales.
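
    Given such a label atlas, the volumetry itself reduces to counting voxels per label and scaling by the voxel volume. A minimal numpy sketch with toy arrays (not the MIPAV plug-in API):

```python
import numpy as np

def label_volumes(labels, voxel_mm3):
    """Volume in mm^3 of each labeled substructure in a 3-D label image.
    `labels` holds integer atlas labels (0 = background)."""
    counts = np.bincount(labels.ravel())
    return {lab: n * voxel_mm3 for lab, n in enumerate(counts) if lab and n}

# Toy example: a 10 x 10 x 10 volume at 1 mm isotropic resolution.
vol = np.zeros((10, 10, 10), dtype=int)
vol[2:5, 2:5, 2:5] = 7                     # pretend label 7 is some structure
print(label_volumes(vol, voxel_mm3=1.0))   # -> {7: 27.0}
```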

  11. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  12. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  13. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  14. EISA 432 Energy Audits Best Practices: Software Tools

    SciTech Connect

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  15. Tools Ensure Reliability of Critical Software

    NASA Technical Reports Server (NTRS)

    2012-01-01

    In November 2006, after attempting to make a routine maneuver, NASA's Mars Global Surveyor (MGS) reported unexpected errors. The onboard software switched to backup resources, and a 2-day lapse in communication took place between the spacecraft and Earth. When a signal was finally received, it indicated that MGS had entered safe mode, a state of restricted activity in which the computer awaits instructions from Earth. After more than 9 years of successful operation gathering data and snapping pictures of Mars to characterize the planet's land and weather, communication between MGS and Earth had suddenly stopped. Months later, a report from NASA's internal review board found that the spacecraft's battery had failed due to an unfortunate sequence of events: updates to the spacecraft's software, which had taken place months earlier, were written to the wrong memory address in the spacecraft's computer. In short, the mission ended because of a software defect. Over the last decade, spacecraft have become increasingly reliant on software to carry out mission operations. In fact, the next mission to Mars, the Mars Science Laboratory, will rely on more software than all earlier missions to Mars combined. According to Gerard Holzmann, manager at the Laboratory for Reliable Software (LaRS) at NASA's Jet Propulsion Laboratory (JPL), even the fault protection systems on a spacecraft are mostly software-based. For reasons like these, well-functioning software is critical for NASA. In the same year as the failure of MGS, Holzmann presented a new approach to critical software development to help reduce risk and provide consistency. He proposed "The Power of 10: Rules for Developing Safety-Critical Code," a small set of rules that can easily be remembered, clearly relate to risk, and allow compliance to be verified. The reaction at JPL was positive, and developers in the private sector embraced Holzmann's ideas.

  16. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and the benefits they bring, and describes how, after the successful implementation of these tools, the library took the initiative of implementing an institutional repository using the DSpace open source software.

  17. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  18. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool is divided into two components, the ``Population Generator'' and the ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as
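
    As a rough illustration of the ``fictitious (random) population'' option and the gnuplot output path, the sketch below draws NEO-like orbital elements from uniform placeholder distributions (deliberately not the Bottke or Granvik models) and writes a whitespace-separated file that gnuplot can plot directly.

    ```python
    import random

    def fictitious_neo_population(n, seed=42):
        """Draw n candidate orbits from uniform, NEO-like placeholder ranges
        and keep those satisfying the usual NEO criterion, perihelion
        q = a(1 - e) < 1.3 AU. Purely illustrative distributions."""
        rng = random.Random(seed)
        pop = []
        for _ in range(n):
            a = rng.uniform(0.5, 4.0)    # semi-major axis [AU]
            e = rng.uniform(0.0, 0.9)    # eccentricity
            i = rng.uniform(0.0, 40.0)   # inclination [deg]
            H = rng.uniform(15.0, 28.0)  # absolute magnitude
            if a * (1.0 - e) < 1.3:
                pop.append((a, e, i, H))
        return pop

    # Columns gnuplot can use directly, e.g.: plot "neo_pop.dat" using 1:2
    with open("neo_pop.dat", "w") as f:
        f.write("# a[AU]  e  i[deg]  H\n")
        for a, e, i, H in fictitious_neo_population(20000):
            f.write(f"{a:.4f} {e:.4f} {i:.2f} {H:.2f}\n")
    ```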

  19. NASA Approach to HPCCP Support Software and Tools

    NASA Technical Reports Server (NTRS)

    Blaylock, Bruce; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    The NASA HPCC Program, together with other agencies participating in the Federal HPCC Program, intends to advance technologies to enable the execution of grand challenge applications at sustained rates up to TeraFLOPS. During 1995-6, NASA undertook two major systems software efforts to improve the state of high performance support software and tools. The first of these activities was a replanning of support software and tools activities internal to the Agency. In replanning the software activities, emphasis was placed on meeting the needs of Grand Challenge users, concentrating on a few projects, and delivering near-term useful results. The revised NASA plan calls for support software and tools activities in four areas: Application Creation Process Support; Application Usage/Operations Support; Advanced Support Software and Tools Concepts; and Metrics-Based Monitoring and Management. The second major activity undertaken was participation in a multiagency Task Force resulting from the Second Pasadena Workshop on System Software and Tools. The task force developed the Guidelines for Writing System Software and Tools Requirements for Parallel and Clustered Computers.

  20. Innovative Software Tools Measure Behavioral Alertness

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  1. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  2. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation describes the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; the ISWHM testbed architecture; and conclusions and next steps.

  3. Developing a Decision Support System: The Software and Hardware Tools.

    ERIC Educational Resources Information Center

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  4. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates both schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactive aspects that the tool requires with the domain analyst to accomplish its acquisition task.

  5. Software tool for xenon gamma-ray spectrometer control

    NASA Astrophysics Data System (ADS)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    Software tool "Acquisition and processing of gamma-ray spectra" for xenon gamma-ray spectrometers control was developed. It supports the multi-windows interface. Software tool has the possibilities for acquisition of gamma-ray spectra from xenon gamma-ray detector via USB or RS-485 interfaces, directly or via TCP-IP protocol, energy calibration of gamma-ray spectra, saving gamma-ray spectra on a disk.

  6. iPhone examination with modern forensic software tools

    NASA Astrophysics Data System (ADS)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are at examining non-jailbroken and jailbroken iPhones.

  7. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
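
    The raster comparison reduces to per-cell arithmetic. A minimal numpy sketch of the reported statistics (minimum, maximum and mean difference, plus RMSE) on synthetic grids; the array sizes and error model are invented.

    ```python
    import numpy as np

    def dem_difference_stats(dem, ref):
        """Difference statistics between a generated DEM and the reference
        DEM (equal-sized arrays; NaN marks no-data cells)."""
        d = dem - ref
        d = d[np.isfinite(d)]  # ignore no-data cells
        return {"min": d.min(), "max": d.max(), "mean": d.mean(),
                "rmse": np.sqrt(np.mean(d ** 2))}

    # Hypothetical grids standing in for one test site.
    ref = np.random.rand(500, 645) * 100
    dem = ref + np.random.normal(0.0, 0.3, ref.shape)  # simulated classifier error
    print(dem_difference_stats(dem, ref))
    ```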

  8. Software Tools for Weed Seed Germination Modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...

  9. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals on new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  10. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  11. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations. PMID:26262417

  12. Development of Fuel Accounting Software Tool

    NASA Astrophysics Data System (ADS)

    Eun, Jong Won; Suk, Juil

    1996-12-01

    A successful spacecraft mission depends on the proper maintenance of the orbit and attitude. One important requirement for orbit and attitude planning is the accurate estimation of the propellant remaining onboard the spacecraft. For a GEO communications satellite, a precise estimate of the fuel remaining is of particular importance. This paper focuses on the bookkeeping method that was developed for calculating the propellant budget by recording the fuel consumption history. In general, the bookkeeping method includes detailed observation of spacecraft maneuver operations throughout the whole mission life. Application of this method is illustrated using a communications satellite. In this fuel accounting software tool, a PC-based spreadsheet is utilized to provide an overall view of input/output elements and strong numerical and graphical capabilities for analyses.
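
    The bookkeeping idea itself is plain arithmetic: record each maneuver, estimate the propellant it consumed, and subtract from the loaded mass. A minimal sketch, with invented numbers and a simple mass-flow-times-burn-time consumption model, is shown below.

    ```python
    # Per-maneuver fuel ledger: thruster mass flow and burn time are assumed
    # known from telemetry; all values here are illustrative.
    maneuvers = [
        # (label, mass_flow [kg/s], burn_time [s])
        ("station-keeping N/S", 0.012, 180.0),
        ("station-keeping E/W", 0.012, 95.0),
        ("momentum dump",       0.004, 30.0),
    ]

    fuel = 250.0  # propellant loaded at launch [kg]
    for label, mdot, t in maneuvers:
        used = mdot * t
        fuel -= used
        print(f"{label:22s} used {used:6.3f} kg, remaining {fuel:8.3f} kg")
    ```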

  13. A Dynamic MPI Software Correctness Checking Tool

    2005-10-31

    Umpire is a prototype tool developed at LLNL by Bronis R. de Supinski, J. M. May, Martin Schulz and Jeffery Vetter as part of the ASDE TRTS project for detecting programming errors at runtime in message passing applications. Umpire monitors the MPI operations of an application by interposing itself between the application and the MPI runtime system using the MPI profiling layer. Umpire then checks the application's MPI behavior for specific errors. Umpire detects errors that are local to individual MPI tasks, including resource errors (e.g., leaks of MPI datatypes and other opaque objects) and overwrites of non-blocking send buffers. It also detects distributed errors, including deadlocks involving any MPI-1 constructs and datatype mismatches between matching communication operations.

  14. Talkoot: software tool to create collaboratories for earth science

    SciTech Connect

    Movva, Sunil; Ramachandran, Rahul; Maskey, Manil; Kulkarni, Ajinkya; Conover, Helen; Nair, U.S.

    2012-01-01

    Open science, where researchers share and publish every element of their research process in addition to the final results, can foster novel ways of collaboration among researchers and has the potential to spontaneously create new virtual research collaborations. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. Advances in technology allow for software tools that can be used by different research groups and institutions to build and support virtual collaborations and infuse open science. This paper describes Talkoot, a software toolkit designed and developed by the authors to provide Earth Science researchers a ready-to-use knowledge management environment and an online platform for collaboration. Talkoot allows Earth Science researchers a means to systematically gather, tag and share their data, analysis workflows and research notes. These Talkoot features are designed to foster rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium sized groups and research centers, as well as large enterprises such as national laboratories and federal agencies.

  15. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  16. Software tool for data mining and its applications

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines), and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, Hyper Envelop, support vector machines, and visualization. The principles and knowledge representation of some function models of data mining are described. The data mining tool is implemented in Visual C++ under Windows 2000. Non-monotonicity in data mining is dealt with by concept hierarchy and layered mining. The tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems, and in the diagnosis of brain glioma.

  17. DEVICE CONTROL TOOL FOR CEBAF BEAM DIAGNOSTICS SOFTWARE

    SciTech Connect

    Pavel Chevtsov

    2008-02-11

    By continuously monitoring the beam quality in the CEBAF accelerator, a variety of beam diagnostics software created at Jefferson Lab makes a significant contribution to the very high availability of the machine for nuclear physics experiments. The interface between this software and the beam instrumentation hardware components is provided by a device control tool, which is optimized for beam diagnostics tasks. As a part of the device/driver development framework at Jefferson Lab, this tool is very easy to support and extend in order to integrate new beam instrumentation components. All device control functions are based on configuration (ASCII text) files that completely define the hardware interface standards used (CAMAC, VME, RS-232, GPIB, etc.) and the communication protocols. The paper presents the main elements of the device control tool for beam diagnostics software at Jefferson Lab.
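
    The paper does not reproduce its configuration syntax, so the sketch below invents a plausible ASCII format and a small parser purely to illustrate configuration-driven device control; none of the device names, fields or syntax are Jefferson Lab's.

    ```python
    # Invented ASCII configuration: each line names a device, its bus
    # standard, and its address/protocol. '#' starts a comment.
    CONFIG = """
    # name    bus     address        protocol
    bpm01     VME     0x3F0000       mem-mapped
    bcm02     CAMAC   crate2/slot5   dataway
    scope1    GPIB    12             488.2
    """

    def parse_config(text):
        devices = {}
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments and blanks
            if not line:
                continue
            name, bus, addr, proto = line.split()
            devices[name] = {"bus": bus, "address": addr, "protocol": proto}
        return devices

    print(parse_config(CONFIG)["bpm01"])
    ```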

  18. Management of Astronomical Software Projects with Open Source Tools

    NASA Astrophysics Data System (ADS)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: for building and automated testing, a system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating system platforms; version control and change management; an enhanced wiki and issue tracking system for online documentation and reporting; and groupware tools such as blog, discussion and calendar. Initially, starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluation of various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA are now using it.

  19. Concepts and tools for the software life cycle

    NASA Astrophysics Data System (ADS)

    Tausworthe, Robert C.

    1985-10-01

    The life cycle process for large software-intensive systems is an extremely intricate and complex process involving many people performing amid a very large base of evolving computer programs, documentation and data. To be successful, the process must be well conceived, planned and conducted; however, the nature of scientific and other high-technology projects involving large-scale software is such that conceptualization, planning and implementation to the degree of detail required is so labor-intensive and unmotivating as to be counter-productive and seldom cost-effective. The tools, techniques and aids needed to engineer, manage and administrate a large software-intensive task are themselves parts of a large software base, and are acquired only at great expense. This paper focuses on the needs of the software life cycle in terms of supporting tools and methodologies. The concept of a distributed network for engineering, management and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineered approach toward the construction of uniform, coordinated tools is proposed as a means to reduce development and maintenance costs, foster creativity, enhance reliability, promote standardization and sustain human motivation.

  20. TRAVIT: software tool to simulate dry etch in maskmaking

    NASA Astrophysics Data System (ADS)

    Babin, S.; Bay, K.; Okulovsky, S.

    2005-06-01

    A software tool, TRAVIT, has been developed to simulate dry etch in maskmaking. The software predicts the etch profile, etched critical dimensions (CDs), and CD variation for any pattern of interest. The software also takes into account the microloading effect, which is pattern dependent and contributes to CD variation. Once the CD variation is known, it can be applied to correct the CD error. Examples of simulations including variable ICP power, physical and chemical etch components, and optimization of bias and CD variation are presented. Incorporating simulation into the maskmaking process can save cost and shorten the time to production.

  1. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  2. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  3. [Utility of noise addition image made by using water phantom and image addition and subtraction software].

    PubMed

    Watanabe, Ryo; Ogawa, Masato; Mituzono, Hiroki; Aoki, Takahiro; Hayano, Mizuho; Watanabe, Yuka

    2010-08-20

    In optimizing exposures, it is very important to evaluate the impact of image noise on image quality. To realize this, there is a need to evaluate how much image noise will make the subject disease invisible. However, it is generally very difficult to shoot images of different quality in a clinical examination. Thus, a method to create a noise addition image by adding image noise to raw data has been reported. However, this approach requires a special system, so it is difficult to implement in many facilities. We have invented a method to easily create a noise addition image by using a water phantom and the image addition and subtraction software that accompanies the device. To create a noise addition image, we first made a noise image by subtracting water phantom images with different SD. A noise addition image was then created by adding the noise image to the original image. By using this method, simulation images with graded SD can be created from the original. Moreover, the noise frequency component of the created noise addition image is the same as that of a real image. Thus, the relationship of image quality to SD in the clinical image can be evaluated. Because a noise addition image can be created easily with only the image addition and subtraction software and a water phantom, this method of LDSI creation from image data can be implemented in many facilities.
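
    A rough numpy sketch of the idea with synthetic data: subtract two water-phantom images to isolate a noise field, then add it to an original image. The image sizes, signal levels, and the use of two equal-SD phantom scans (rescaled by 1/sqrt(2), since subtraction doubles the variance) are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    phantom_a = 1000 + rng.normal(0, 12, (256, 256))  # phantom scan, SD ~ 12
    phantom_b = 1000 + rng.normal(0, 12, (256, 256))  # second scan, same setup

    noise = (phantom_a - phantom_b) / np.sqrt(2)  # rescale: subtraction doubles variance
    original = np.full((256, 256), 40.0)          # stand-in for a clinical image
    noisy = original + noise                      # the noise addition image

    print(f"added SD: {noise.std():.1f}")
    ```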

  4. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real-time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer-controlled hardware set prior to its deployment in its functional environment, as well as to test the equipment set by supplying certain well-known stimuli. The successor system (FTF) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special function keyboard handler. The additional features require the inclusion of much new software to the original set from which FTF was developed. As a result, it is necessary to split the system into a dual programming configuration with high rates of inter-program communications. A generalized information routing mechanism is used to support this configuration.

  5. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications, one built in Java and the other in Objective-C for the Apple iPad, were then developed to present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations, and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification, with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
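
    To picture the separation of instrument and application, here is a toy instrument document and a reader; the element and attribute names are invented for illustration and are not the paper's actual schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical instrument document in the spirit of the paper's approach.
    DOC = """
    <instrument name="mobility-assessment" version="2">
      <item id="walk-1" video="walk_level.mp4">
        <prompt>How difficult is it for you to walk on level ground?</prompt>
        <response value="1">No difficulty</response>
        <response value="2">Some difficulty</response>
        <response value="3">Unable to do</response>
      </item>
    </instrument>
    """

    # Any application that can interpret the document can administer the
    # instrument; this reader just prints items and response options.
    root = ET.fromstring(DOC)
    for item in root.iter("item"):
        print(item.get("id"), "-", item.findtext("prompt"))
        for r in item.findall("response"):
            print("  ", r.get("value"), r.text)
    ```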

  6. Proposing a Mathematical Software Tool in Physics Secondary Education

    ERIC Educational Resources Information Center

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…

  7. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
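
    Hack's (1973) method rests on the stream-gradient index SL = (dH/dL) x L, where L is the distance from the source to the reach midpoint; reaches where SL spikes above the trend are knickpoint candidates. A minimal sketch on a synthetic longitudinal profile follows; the threshold rule is an assumption, not the tool's actual criterion.

    ```python
    import numpy as np

    def sl_index(dist, elev):
        """Hack's stream-gradient index SL = (dH/dL) * L per reach, where L
        is the distance from the source to the reach midpoint."""
        dH = -np.diff(elev)               # elevation drop per reach
        dL = np.diff(dist)
        L = (dist[:-1] + dist[1:]) / 2.0  # midpoint distance from source
        return (dH / dL) * L

    # Synthetic profile (distance [m], elevation [m]) with an inserted break.
    dist = np.linspace(0, 10000, 101)
    elev = 500 * np.exp(-dist / 4000)
    elev[60:] -= 15                       # artificial knickpoint
    sl = sl_index(dist, elev)
    threshold = sl.mean() + 2 * sl.std()  # simple anomaly rule (assumption)
    print("possible knickpoints near:", dist[:-1][sl > threshold])
    ```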

  8. Using Software Tools to Automate the Assessment of Student Programs.

    ERIC Educational Resources Information Center

    Jackson, David

    1991-01-01

    Argues that advent of computer-aided instruction (CAI) systems for teaching introductory computer programing makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programing problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…

  9. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  10. Role of Social Software Tools in Education: A Literature Review

    ERIC Educational Resources Information Center

    Minocha, Shailey

    2009-01-01

    Purpose: The purpose of this paper is to provide a review of literature on the role of Web 2.0 or social software tools in education. Design/methodology/approach: This paper is a critical and comprehensive review of a range of literature sources (until January 2009) addressing the various issues related to the educator's perspective of pedagogical…

  11. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  12. Simple tools and software for precision weed mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    If you have a color digital camera and a handheld GPS unit, you can map weed problems in your fields. German researchers are perfecting technology to map weed species and density with digital cameras for precision herbicide application. ...

  13. GenePRIMP: A software quality control tool

    SciTech Connect

    Amrita Pati

    2010-05-05

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  14. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2010-09-01

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  15. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  16. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of meeting the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  17. A Software Communication Tool for the Tele-ICU

    PubMed Central

    Pimintel, Denise M.; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider. PMID:24551398

  18. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
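
    Recalibration of this kind of model is often a log-log regression of effort against size on an organization's own completed projects. A hedged sketch using a generic COCOMO-style form, effort = a * KLOC^b, with invented historical data (not COSTMODL's actual equations):

    ```python
    import numpy as np

    # Invented project history: delivered size [KLOC] and effort [person-months].
    kloc = np.array([12.0, 33.0, 6.5, 90.0, 21.0])
    effort = np.array([41.0, 140.0, 18.0, 470.0, 85.0])

    # Fit log(effort) = b * log(KLOC) + log(a) by least squares.
    b, log_a = np.polyfit(np.log(kloc), np.log(effort), 1)
    a = np.exp(log_a)
    print(f"calibrated model: effort = {a:.2f} * KLOC^{b:.2f}")
    print(f"estimate for a 50 KLOC project: {a * 50 ** b:.0f} person-months")
    ```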

  19. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
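
    For readers unfamiliar with the target formalism, a Petri net can be captured in a few lines: places hold tokens, and a transition fires when every input place holds a token, consuming inputs and producing outputs. The toy net below is a generic illustration, not DFPN's internal representation.

    ```python
    # Minimal Petri-net sketch: tracing firings walks an execution path.
    marking = {"request": 1, "server_idle": 1, "done": 0}
    transitions = {
        "serve": {"in": ["request", "server_idle"],
                  "out": ["done", "server_idle"]},
    }

    def enabled(t):
        # A transition is enabled when all input places hold a token.
        return all(marking[p] > 0 for p in transitions[t]["in"])

    def fire(t):
        assert enabled(t), f"{t} is not enabled"
        for p in transitions[t]["in"]:
            marking[p] -= 1
        for p in transitions[t]["out"]:
            marking[p] += 1

    fire("serve")
    print(marking)  # {'request': 0, 'server_idle': 1, 'done': 1}
    ```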

  20. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  1. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  2. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  3. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  4. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  5. Evaluation of free non-diagnostic DICOM software tools

    NASA Astrophysics Data System (ADS)

    Liao, Wei; Deserno, Thomas M.; Spitzer, Klaus

    2008-03-01

    A variety of software exists to interpret files or directories compliant with the Digital Imaging and Communications in Medicine (DICOM) standard and display them as individual images or volume rendered objects. Some of them offer further processing and analysis features. The surveys that have been published so far are partly out of date, and neither a detailed description of the software functions nor a comprehensive comparison is given. This paper aims at evaluation and comparison of freely available, non-diagnostic DICOM software with respect to the following aspects: (i) data import; (ii) data export; (iii) header viewing; (iv) 2D image viewing; (v) 3D volume viewing; (vi) support; (vii) portability; (viii) workability; and (ix) usability. In total, 21 tools were included: 3D Slicer, AMIDE, BioImage Suite, DicomWorks, EViewBox, ezDICOM, FPImage, ImageJ, JiveX, Julius, MedImaView, MedINRIA, MicroView, MIPAV, MRIcron, Osiris, PMSDView, Syngo FastView, TomoVision, UniViewer, and XMedCon. Our results in table form can ease the selection of appropriate DICOM software tools. In particular, we discuss use cases for the inexperienced user, data conversion, and volume rendering, and suggest Syngo FastView or PMSDView, DicomWorks or XMedCon, and ImageJ or UniViewer, respectively.

  6. Westinghouse Waste Simulation and Optimization Software Tool - 13493

    SciTech Connect

    Mennicken, Kim; Aign, Joerg

    2013-07-01

    Radioactive waste is produced during NPP operation and NPP D&D. Different kinds of waste with different volumes and properties have to be treated. Finding a technically and commercially optimized waste treatment concept is a difficult and time-consuming process. The Westinghouse waste simulation and optimization software tool is an approach to studying the total life cycle cost of any waste management facility. The tool enables the user of the simulation and optimization software to plan processes and storage buildings and to identify bottlenecks in the overall waste management design before starting detailed planning activities. Furthermore, application of the software enables the user to optimize the number of treatment systems, to determine the minimum design capacity for onsite storage facilities, to identify bottlenecks in the overall design, and to identify the most cost-effective treatment paths by maintaining optimal waste treatment technologies. In combination with proven waste treatment equipment and integrated waste management solutions, the waste simulation and optimization software provides reliable qualitative results that lead to effective planning and minimization of the total project planning risk of any waste management activity. (authors)

  7. Software Certification for Temporal Properties With Affordable Tool Qualification

    NASA Technical Reports Server (NTRS)

    Xia, Songtao; DiVito, Benedetto L.

    2005-01-01

    It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.

  8. NTRFinder: a software tool to find nested tandem repeats.

    PubMed

    Matroud, Atheer A; Hendy, M D; Tuffley, C P

    2012-02-01

    We introduce the software tool NTRFinder to search for a complex repetitive structure in DNA we call a nested tandem repeat (NTR). An NTR is a recurrence of two or more distinct tandem motifs interspersed with each other. We propose that NTRs can be used as phylogenetic and population markers. We have tested our algorithm on both real and simulated data, and present some real NTRs of interest. NTRFinder can be downloaded from http://www.maths.otago.ac.nz/~aamatroud/. PMID:22121222
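
    The structure NTRFinder searches for can be illustrated with a toy check: a sequence is an exact nested tandem repeat of two motifs if it can be decomposed, left to right, into an interspersed run of both. The greedy decomposition below is only a sketch of the concept; NTRFinder's actual algorithm also tolerates the mutations found in real DNA.

      # Toy exact-match check for a nested tandem repeat of two motifs.
      def is_nested_tandem_repeat(seq, motif_a, motif_b):
          i, seen = 0, set()
          while i < len(seq):
              if seq.startswith(motif_a, i):
                  seen.add(motif_a); i += len(motif_a)
              elif seq.startswith(motif_b, i):
                  seen.add(motif_b); i += len(motif_b)
              else:
                  return False
          return len(seen) == 2          # both motifs must actually occur

      print(is_nested_tandem_repeat("ACGACGTTACGTT", "ACG", "TT"))  # True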

  9. An Open Source Software Tool for Hydrologic Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Park, Dong Kwan; Shin, Mun-Ju; Kim, Young-Oh

    2015-04-01

    With the Intergovernmental Panel on Climate Change (IPCC) regularly publishing assessment reports containing updated forecasts and scenarios, hydrologic assessment studies of these scenarios must also be performed periodically. Practitioners, including scientists and government agencies, need convenient tools that carry the analysis from climate input data (historical observations and climate change scenarios) through rainfall-runoff simulation to assessment. We propose HydroCAT (Hydrologic Climate change Assessment Tool), a flexible software tool designed to simplify and streamline hydrologic climate change assessment studies by: taking climate inputs from general circulation models under the latest climate change scenarios; downscaling them with statistical methods; calibrating and running multiple well-known lumped conceptual hydrologic models; and assessing the results with statistical methods. The package is implemented as an open-source, R-based framework that supports a variety of data formats, hydrologic models, and climate change scenarios. Its use is demonstrated in a case study of the Geum River basin in the Republic of Korea.
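
    The kind of lumped conceptual model HydroCAT calibrates can be sketched as a single linear-reservoir "bucket" driven by precipitation and evaporation. The sketch below is an illustrative assumption, not HydroCAT code (the actual package is written in R), and its parameter values are arbitrary.

      # Minimal lumped rainfall-runoff model: one linear reservoir.
      def bucket_model(precip_mm, evap_mm, k=0.2, storage0=50.0):
          storage, runoff = storage0, []
          for p, e in zip(precip_mm, evap_mm):
              storage = max(storage + p - e, 0.0)   # daily water balance
              q = k * storage                       # linear reservoir outflow
              storage -= q
              runoff.append(q)
          return runoff

      print(bucket_model([10.0, 0.0, 5.0], [2.0, 2.0, 2.0]))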

  10. CAD/CAM software for an industrial laser manufacturing tool

    NASA Astrophysics Data System (ADS)

    Stassen Boehlen, Ines; Fieret, Jim; Holmes, Andrew S.; Lee, Kin W.

    2003-07-01

    A facility for rapid prototyping of MEMS devices is crucial for the development of novel miniaturized components in all sectors of high-tech industry, e.g. telecommunications, information technology, micro-optics and aerospace. To overcome the disadvantages of existing techniques in terms of cost and flexibility, a new approach has been taken to provide a tool for rapid prototyping and small-scale production: Complex CAD/CAM software has been developed that automatically generates the tool paths according to a CAD drawing of the MEMS device. As laser ablation is a much more complicated process than mechanical machining, for which such software has already been in use for many years, the generation of these tool paths relies not only on geometric considerations, but also on a sophisticated simulation module taking into account various material and laser parameters and micro-effects. The following laser machining options have been implemented: cutting, hole drilling, slot cutting, 2D area clearing, pocketing and 2½D surface machining. Once the tool paths are available, a post processor translates this information into CNC commands that control a scanner head. This scanner head then guides the beam of a UV solid-state laser to machine the desired structure by direct laser ablation.

  11. Knowledge-engineering software. A demonstration of a high-end tool.

    PubMed

    Salzman, G C; Krall, R B; Marinuzzi, J G

    1988-06-01

    Many investigators wanting to apply knowledge-based systems (KBSs) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with such high-end KBS tools as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. This paper illustrates the capability of one of these high-end tools. First, a table showing the classification of benign soft tissue tumors was converted into a KEE knowledge base. The tools available in KEE were then used to identify the tumor type for a hypothetical patient. PMID:3408548

  12. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the past strength of the Earth's magnetic field from volcanic rocks or archeological materials. By reducing the number of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010), and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, a check on whether the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained, and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
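
    The statistical core of the protocol can be sketched as a bootstrapped linear regression of the multispecimen ratio against the laboratory field, with the paleointensity read off at the zero crossing and the y-intercept tested against a prescribed window. The data, the window [-1.1, -0.9] (an assumed band around the theoretical intercept of -1 for the fraction-corrected ratio), and the number of resamples below are all illustrative assumptions, not MSP-Tool's exact criteria.

      # Bootstrap regression sketch for multispecimen paleointensity data.
      import numpy as np

      rng = np.random.default_rng(0)
      H = np.arange(10.0, 90.0, 10.0)               # lab fields (uT)
      Q = np.array([-0.79, -0.57, -0.32, -0.10,     # illustrative MSP ratios
                     0.12,  0.32,  0.57,  0.77])

      boot = []                                     # zero-crossing estimates
      for _ in range(2000):
          idx = rng.integers(0, len(H), len(H))     # resample with replacement
          slope, intercept = np.polyfit(H[idx], Q[idx], 1)
          boot.append(-intercept / slope)

      lo, hi = np.percentile(boot, [2.5, 97.5])
      slope, intercept = np.polyfit(H, Q, 1)
      print(f"paleointensity ~ {-intercept/slope:.1f} uT "
            f"(95% bootstrap CI {lo:.1f}-{hi:.1f} uT)")
      print("intercept criterion passed:", -1.1 <= intercept <= -0.9)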

  13. Classroom Live: a software-assisted gamification tool

    NASA Astrophysics Data System (ADS)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  14. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    PubMed

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes. PMID:24364196

  15. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to an organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects, and this user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse-sensitive, with an extensive context-sensitive help system that makes it possible for a new user to install and operate the program and to learn the fundamentals of cost estimation without prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
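
    Of the five algorithms listed, Basic COCOMO is the simplest to sketch: effort and schedule follow power laws of delivered source size. The constants below are the published Basic COCOMO "organic mode" values; COSTMODL's calibrated coefficients and user customizations may differ.

      # Basic COCOMO (organic mode) effort and schedule sketch.
      def basic_cocomo_organic(kloc):
          effort_pm = 2.4 * kloc ** 1.05           # effort in person-months
          schedule_mo = 2.5 * effort_pm ** 0.38    # development time in months
          return effort_pm, schedule_mo

      effort, months = basic_cocomo_organic(32.0)  # an assumed 32 KLOC project
      print(f"{effort:.0f} person-months over {months:.0f} months "
            f"(~{effort / months:.1f} average staff)")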

  16. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems. PMID:14534409

  17. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    PubMed Central

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I.; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high-resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. The resource resides on XNAT Central and contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data, along with the associated meta-data and computational tools, publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help advance neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers to advancing neuroimaging research, such as lack of local organization and standard descriptions. PMID:24223551

  18. An Approach to Building a Traceability Tool for Software Development

    NASA Technical Reports Server (NTRS)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and to communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional

  19. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that brings together efforts to introduce fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which directly translates fuzzy logic programs into Prolog code, so that these residual programs can be safely executed inside any standard Prolog interpreter in a way that is completely transparent to the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
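
    The flavor of programs such a system executes can be shown with a toy: facts carry truth degrees in [0, 1], and a rule's conclusion is obtained by combining the body degrees with a t-norm and the rule's own confidence. The product t-norm and all names below are illustrative simplifications of the multi-adjoint framework, not the authors' Prolog translation.

      # Toy fuzzy rule evaluation with the product t-norm.
      facts = {"young(anna)": 0.9, "athletic(anna)": 0.7}

      def product_tnorm(*degrees):
          result = 1.0
          for d in degrees:
              result *= d
          return result

      # Rule: healthy(X) <- young(X) & athletic(X), with confidence 0.8.
      confidence = 0.8
      body = product_tnorm(facts["young(anna)"], facts["athletic(anna)"])
      print(f"healthy(anna) holds to degree {confidence * body:.3f}")  # 0.504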

  1. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often move a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  2. A software tool for graphically assembling damage identification algorithms

    NASA Astrophysics Data System (ADS)

    Allen, David W.; Clough, Joshua A.; Sohn, Hoon; Farrar, Charles R.

    2003-08-01

    At Los Alamos National Laboratory (LANL), various algorithms for structural health monitoring problems have been explored over the last 5 to 6 years. The original DIAMOND (Damage Identification And MOdal aNalysis of Data) software was developed as a package of modal analysis tools with some frequency domain damage identification algorithms included. Since the conception of DIAMOND, the Structural Health Monitoring (SHM) paradigm at LANL has been cast in the framework of statistical pattern recognition, promoting data-driven damage detection approaches. To reflect this shift and to allow user-friendly analyses of data, a new piece of software, DIAMOND II, is under development. The Graphical User Interface (GUI) of the DIAMOND II software is based on the idea of GLASS (Graphical Linking and Assembly of Syntax Structure) technology, which is currently being implemented at LANL. GLASS is a Java-based GUI that allows drag-and-drop construction of algorithms from various categories of existing functions. On the platform of the underlying GLASS technology, DIAMOND II is simply a module specifically targeting damage identification applications. Users can assemble various routines, building their own algorithms or benchmark-testing different damage identification approaches without writing a single line of code.

  3. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data: in all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates what is observed to a reasonable degree.

  4. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of other alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis only examines the displacement, velocity, and acceleration of a mechanism, without considering the effects of the masses of components. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, comparisons between software tools are presented in terms of the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. The Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.

  5. A survey on open source software testing tools: a preliminary study in 2011

    NASA Astrophysics Data System (ADS)

    Emami, Seyed Amir; Sim, Jason Chin Lung; Sim, Kwan Yong

    2011-12-01

    Software testing is a costly and time-consuming process in software development. Therefore, software testing tools are often deployed to automate the process in order to reduce cost and improve efficiency. However, many of them are proprietary and expensive, so open source software testing tools could be an appealing alternative. In this paper, we survey the current state of open source software testing tools from three aspects, namely, their availability for different programming platforms and types of testing activities, maintenance of the tools, and license limitations. From the 152 tools surveyed, we found that open source software testing tools are not only widely available for popular programming platforms but also support a wide range of testing activities. Furthermore, more than half of the tools surveyed have been actively maintained and updated by the open source communities. Finally, these tools have very few licensing limitations on commercial use, customization and redistribution.

  6. User Guide for the STAYSL PNNL Suite of Software Tools

    SciTech Connect

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
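
    The generalized least-squares update at the heart of spectral adjustment has a compact closed form: given a prior spectrum phi0 with covariance M, measured rates a with covariance V, and a cross-section matrix S, the adjusted spectrum is phi = phi0 + M S^T (S M S^T + V)^-1 (a - S phi0). The sketch below shows this textbook update on tiny made-up matrices; it is not the STAYSL PNNL implementation, which additionally handles correction factors and full cross-section covariances.

      # Textbook GLS spectral adjustment on a two-group stub problem.
      import numpy as np

      phi0 = np.array([1.0e12, 5.0e11])             # prior group fluxes
      M = np.diag((0.3 * phi0) ** 2)                # assumed 30% prior uncertainty
      S = np.array([[1.0e-24, 3.0e-24],             # cross sections (cm^2), stub
                    [2.0e-24, 0.5e-24]])
      a = np.array([2.9e-12, 2.4e-12])              # measured reaction rates
      V = np.diag((0.05 * a) ** 2)                  # 5% measurement uncertainty

      K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)  # GLS gain matrix
      phi_adj = phi0 + K @ (a - S @ phi0)           # adjusted spectrum
      M_adj = M - K @ S @ M                         # adjusted covariance
      print(phi_adj, np.sqrt(np.diag(M_adj)))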

  7. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of an algebraic method for XML data processing that accelerates XML parsing. The proposed nontraditional approach to fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, more user-friendly API for XML transformations. The proposed parser is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach to searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is thus accessible to web consumers, who can control XML file processing, search for different elements (tags), and delete or add XML content. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.

  8. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  9. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    SciTech Connect

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MATLAB (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the electron Monte Carlo (eMC) algorithm and 4 versions of the Anisotropic Analytical Algorithm (AAA). During this period we detected one change to the algorithms that affected clinical practice. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan-level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully automated testing and reporting at scheduled intervals.
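
    The gamma-index module's metric can be sketched in one dimension: each reference point receives the minimum, over all evaluated points, of the combined dose-difference and distance-to-agreement distance. The tolerances (3%/3 mm), global normalization, and Gaussian profiles below are illustrative assumptions; the tool itself is written in MATLAB.

      # 1D gamma-index sketch with made-up Gaussian dose profiles.
      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta_mm=3.0):
          gammas = []
          for xr, dr in zip(x_ref, d_ref):
              g = np.sqrt(((x_eval - xr) / dta_mm) ** 2 +
                          ((d_eval - dr) / (dd * d_ref.max())) ** 2)
              gammas.append(g.min())
          return np.array(gammas)

      x = np.linspace(0.0, 100.0, 101)              # positions (mm)
      ref = np.exp(-((x - 50.0) / 20.0) ** 2)       # "measured" profile
      calc = np.exp(-((x - 51.0) / 20.0) ** 2)      # calculation, 1 mm shift
      g = gamma_1d(x, ref, x, calc)
      print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")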

  10. ELER software - a new tool for urban earthquake loss assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both local and global level, as well as public information.

  11. Hardware additions to microprocessor architecture aid software development

    NASA Technical Reports Server (NTRS)

    Sievers, M. W.

    1976-01-01

    An address trap (breakpoint) mechanism and last-in-first-out (LIFO) address stack are suggested as two additions to the basic microprocessor architecture whose functions are solely to aid the programmer. These devices provide the programmer with the ability to specify address breakpoints and to trace program execution back through N instructions, where N is the depth of the stack. Both devices, plus interface logic and buffering, have been designed for an INTEL 8080-based system using approximately 25 integrated-circuit packages.
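
    A software analogue makes the two proposed mechanisms concrete: a set of trap addresses, and a fixed-depth last-in-first-out trace of recently executed addresses. The sketch below is a conceptual illustration in Python, not the INTEL 8080 hardware design; the depth of 8 and all addresses are arbitrary.

      # Software model of an address trap plus a bounded execution trace.
      from collections import deque

      class TraceDebugger:
          def __init__(self, breakpoints, depth=8):
              self.breakpoints = set(breakpoints)
              self.trace = deque(maxlen=depth)   # oldest entries fall off

          def execute(self, address):
              self.trace.append(address)
              if address in self.breakpoints:    # the address trap fires
                  recent = [hex(a) for a in reversed(self.trace)]
                  print(f"break at {address:#06x}; last addresses: {recent}")

      dbg = TraceDebugger(breakpoints=[0x0103])
      for addr in (0x0100, 0x0101, 0x0102, 0x0103):
          dbg.execute(addr)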

  12. Software tools for simultaneous data visualization and T cell epitopes and disorder prediction in proteins.

    PubMed

    Jandrlić, Davorka R; Lazić, Goran M; Mitić, Nenad S; Pavlović, Mirjana D

    2016-04-01

    We have developed EpDis and MassPred, extendable open source software tools that support bioinformatic research and enable parallel use of different methods for the prediction of T cell epitopes, disorder and disordered binding regions and hydropathy calculation. These tools offer a semi-automated installation of chosen sets of external predictors and an interface allowing for easy application of the prediction methods, which can be applied either to individual proteins or to datasets of a large number of proteins. In addition to access to prediction methods, the tools also provide visualization of the obtained results, calculation of consensus from results of different methods, as well as import of experimental data and their comparison with results obtained with different predictors. The tools also offer a graphical user interface and the possibility to store data and the results obtained using all of the integrated methods in the relational database or flat file for further analysis. The MassPred part enables a massive parallel application of all integrated predictors to the set of proteins. Both tools can be downloaded from http://bioinfo.matf.bg.ac.rs/home/downloads.wafl?cat=Software. Appendix A includes the technical description of the created tools and a list of supported predictors. PMID:26851400

  13. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue in the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is used in the ongoing Student Space Exploration & Technology Initiative (SSETI, www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach) and involves student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, and the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking these capabilities and limitations into account, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  14. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-01-01

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
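
    The Manning equation underlying the GFT is simple enough to state directly: V = (1/n) R^(2/3) S^(1/2), with discharge Q = V A. The sketch below applies it to a plain rectangular cross-section with an assumed roughness coefficient; the GFT itself derives channel geometry from specially processed digital elevation data.

      # Manning's equation for steady open-channel flow (SI units).
      def manning_discharge(width_m, depth_m, slope, n=0.035):
          area = width_m * depth_m                  # flow area A
          wetted_perimeter = width_m + 2.0 * depth_m
          r_hyd = area / wetted_perimeter           # hydraulic radius R
          velocity = (1.0 / n) * r_hyd ** (2.0 / 3.0) * slope ** 0.5
          return area * velocity                    # discharge Q (m^3/s)

      print(f"Q = {manning_discharge(20.0, 2.0, 0.001):.1f} m^3/s")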

  15. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, Gregory; Ferguson, B. Lynn

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments that decrease, but do not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on those findings, developing a larger knowledge base and a procedure for extracting rules to aid in TBC design.

  16. A software tool for the analysis of neuronal morphology data

    PubMed Central

    2014-01-01

    Anatomy plays a fundamental role in supporting and shaping nervous system activity. The remarkable progress of computer processing power within the last two decades has enabled the generation of electronic databases of complete three-dimensional (3D) dendritic and axonal morphology for neuroanatomical studies. Several laboratories freely post their reconstructions online after publication, e.g., on NeuroMorpho.Org (Nat Rev Neurosci 7:318-324, 2006). These neuroanatomical archives represent a crucial resource for exploring the relationship between structure and function in the brain (Front Neurosci 6:49, 2012). However, such 'Cartesian' descriptions bear little intuitive information for neuroscientists. Here, we developed a simple prototype of a MATLAB-based software tool to quantitatively describe the 3D neuronal structures from public repositories. The program imports neuronal reconstructions and quantifies statistical distributions of basic morphological parameters such as branch length, tortuosity, branch genealogy and bifurcation angles. Using these morphological distributions, our algorithm can generate a set of virtual neurons readily usable for network simulations. PMID:24529393
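
    One of the listed parameters, tortuosity, has a particularly simple definition: the path length along a branch divided by the straight-line distance between its endpoints. The sketch below computes it for a made-up 3D branch; the actual tool is MATLAB-based and reads whole reconstructions from repositories such as NeuroMorpho.Org.

      # Branch tortuosity from 3D sample points along a single branch.
      import numpy as np

      def tortuosity(points_3d):
          p = np.asarray(points_3d, dtype=float)
          path_len = np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1))
          chord = np.linalg.norm(p[-1] - p[0])
          return path_len / chord

      branch = [(0, 0, 0), (1, 0.5, 0), (2, 0, 0.5), (3, 0, 0)]
      print(f"tortuosity = {tortuosity(branch):.2f}")  # 1.0 = perfectly straight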

  17. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of SHMTools is embeddable: MATLAB functions that can be cross-compiled into generic C programs to run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active-sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of the embeddable software are discussed, along with several example processes that can serve as guidelines for future use of the software.

  18. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    NASA Astrophysics Data System (ADS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-02-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Problems arising from uncertainty may then be identified only late in the design process, leading to additional costs. Although numerous tools exist to support uncertainty calculation, their limited use in early design phases may stem from low awareness of their existence and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
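
    The GUM method the material is based on propagates input standard uncertainties through sensitivity coefficients: u_c(y)^2 = sum_i (df/dx_i)^2 u(x_i)^2 for uncorrelated inputs. The sketch below estimates the partial derivatives numerically for an illustrative measurand (electrical power P = V*I, a choice made here, not taken from the paper).

      # GUM combined standard uncertainty with finite-difference partials.
      def gum_combined_uncertainty(f, x, u, h=1e-6):
          total = 0.0
          for i in range(len(x)):
              xp = list(x)
              xp[i] += h
              ci = (f(*xp) - f(*x)) / h     # sensitivity coefficient df/dx_i
              total += (ci * u[i]) ** 2
          return total ** 0.5

      power = lambda V, I: V * I
      u_c = gum_combined_uncertainty(power, [230.0, 1.5], [0.5, 0.02])
      print(f"P = {230.0 * 1.5:.1f} W, u_c = {u_c:.2f} W")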

  19. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  1. Kid Tools: Self-Management, Problem- Solving, Organizational, and Planning Software for Children and Teachers

    ERIC Educational Resources Information Center

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Herine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  2. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for computing multi-risk and for managing, visualizing and comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products of the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for

  3. Technology Pedagogy: Software Tools for Teaching and Learning

    ERIC Educational Resources Information Center

    Berry, James; Staub, Nancy

    2011-01-01

    Adoption of technology for teaching and learning is not as significant as the adoption and use of software that is used as a pedagogical extension of a teacher's approach to classroom instruction. It is the dynamic and integrated use of software that extends the pedagogical role of the teacher beyond the traditional lecture and discussion format.…

  4. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  5. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external analog-to-digital converter (GW Instruments, Somerville, Massachusetts) connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard. First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage, together with a static rule base, to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  6. Robust Optimal Design of Experiments for Model Discrimination Using an Interactive Software Tool

    PubMed Central

    Stegmaier, Johannes; Skanda, Dominik; Lebiedz, Dirk

    2013-01-01

    Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative based numerical algorithms. The method takes into account parameter variabilities such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from literature. New robust optimal designs that allow to discriminate between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU General Public License

  8. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing and critiquing of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.

  9. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics.

    PubMed

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R; Chen, Albert J; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G; Liu, Xiaowen; Ge, Ying

    2016-02-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644
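
    The characterization of PTMs mentioned above can be illustrated, in a much-reduced form, by matching an observed intact proteoform mass against a base mass plus combinations of PTM mass shifts. The modification masses below are standard monoisotopic values; the example masses and tolerance are invented, and this is not MASH Suite Pro's algorithm.

      # Toy proteoform mass matcher: which PTM combinations explain a mass?
      from itertools import combinations_with_replacement

      PTM_MASSES = {            # monoisotopic mass shifts in Da
          "phospho": 79.96633,
          "acetyl": 42.01057,
          "methyl": 14.01565,
      }

      def explain_mass(base_mass, observed_mass, tol_ppm=10.0, max_mods=3):
          """Return PTM combinations whose total shift matches the observation."""
          matches = []
          names = list(PTM_MASSES)
          for n in range(max_mods + 1):
              for combo in combinations_with_replacement(names, n):
                  total = base_mass + sum(PTM_MASSES[m] for m in combo)
                  if abs(total - observed_mass) / observed_mass * 1e6 <= tol_ppm:
                      matches.append(combo)
          return matches

      # Example: an (invented) 18 kDa proteoform carrying one phosphorylation.
      print(explain_mass(18000.000, 18079.966))     # -> [('phospho',)]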

  10. Emerging role of bioinformatics tools and software in evolution of clinical research

    PubMed Central

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation, and lead optimization. This is followed by preclinical trials, intensive clinical trials, and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics, and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture, and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software such as Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials, and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design, and clinical research. PMID:27453827

  11. Emerging role of bioinformatics tools and software in evolution of clinical research.

    PubMed

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis, and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation, and lead optimization. This is followed by preclinical trials, intensive clinical trials, and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics, and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture, and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software such as Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials, and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design, and clinical research. PMID:27453827

  12. Development of Automatic Testing Tool for `Design & Coding Standard' for Railway Signaling Software

    NASA Astrophysics Data System (ADS)

    Hwang, Jong-gyu; Jo, Hyun-jeong

    2009-08-01

    In accordance with recent developments in computer technology, the dependency of railway signaling systems on computer software has further increased, and accordingly, testing for the safety and reliability of railway signaling system software has become more important. This paper presents an automated testing tool for coding rules for railway signaling system software and describes its implementation. The testing items in the implemented tool refer to the international standards for railway system software and the MISRA-C standard. This automated testing tool can also be utilized at the assessment stage for railway signaling systems, and it is anticipated that it will be useful at the software development stage as well.
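
    A minimal sketch of what such an automated coding-rule checker does: scan source lines against a rule table and report violations. The ban on goto mirrors a well-known MISRA-C restriction; the line-length rule is purely illustrative, and production checkers parse the code rather than applying regular expressions.

      # Toy coding-rule checker for C source files (illustrative rules only).
      import re
      import sys

      RULES = [
          (re.compile(r"\bgoto\b"), "use of 'goto' is forbidden"),
          (re.compile(r".{81,}"), "line exceeds 80 characters"),
      ]

      def check_file(path):
          violations = []
          with open(path, encoding="utf-8") as source:
              for lineno, line in enumerate(source, start=1):
                  for pattern, message in RULES:
                      if pattern.search(line.rstrip("\n")):
                          violations.append((path, lineno, message))
          return violations

      if __name__ == "__main__":
          for violation in check_file(sys.argv[1]):
              print("%s:%d: %s" % violation)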

  13. Supporting student nurses in practice with additional online communication tools.

    PubMed

    Morley, Dawn A

    2014-01-01

    Student nurses' potential isolation and difficulties of learning on placement have been well documented and, despite attempts to make placement learning more effective, evidence indicates the continuing schism between formal learning at university and situated learning on placement. First-year student nurses, entering placement for the first time, are particularly vulnerable to the vagaries of practice. During 2012, two first-year student nurse seminar groups (52 students) were voluntarily recruited for a mixed-method study to determine the usage of additional online communication support mechanisms (Facebook, wiki, an email group and traditional methods of support using individual email or phone) while undertaking their first five-week clinical placement. The study explores the possibility of strengthening clinical learning and support by promoting the use of Web 2.0 support groups for student nurses. Results indicate a high level of interactivity in both peer and academic support in the use of Facebook and a high level of interactivity in one wiki group. Students' qualitative comments voice an appreciation of being able to access university and peer support whilst working individually on placement. Recommendations from the study challenge universities to use online communication tools already familiar to students to complement the support mechanisms that exist for practice learning. This is tempered by recognition of the responsibility of academics to ensure their students are aware of safe and effective online communication. PMID:23871299

  14. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.
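
    The declarative anomaly-detection properties described above might, in a much-simplified form, look like the following sketch: a property bundles a plausible value range and a maximum step between consecutive readings, and is checked against the stream independently of the acquisition system. All names and thresholds here are invented.

      # Toy declarative anomaly-detection property checked over a sensor stream.
      from dataclasses import dataclass

      @dataclass
      class RangeRateProperty:
          name: str
          minimum: float        # physically plausible lower bound
          maximum: float        # physically plausible upper bound
          max_step: float       # largest plausible change between readings

          def anomalies(self, readings):
              previous = None
              for i, value in enumerate(readings):
                  if not (self.minimum <= value <= self.maximum):
                      yield (i, f"{self.name}: value {value} out of range")
                  elif previous is not None and abs(value - previous) > self.max_step:
                      yield (i, f"{self.name}: jump of {abs(value - previous):.1f}")
                  previous = value

      air_temp = RangeRateProperty("air_temperature_C", -40.0, 55.0, 5.0)
      stream = [21.3, 21.4, 35.9, 21.6, -99.9]      # -99.9: typical sentinel value
      for index, message in air_temp.anomalies(stream):
          print(index, message)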

  15. OpenROCS: a software tool to control robotic observatories

    NASA Astrophysics Data System (ADS)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the necessary processes to implement responses to the system events that appear in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provides a high flexibility to be adapted to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).
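
    The event-action pattern at the core of such a control system can be sketched as a small dispatcher that maps event names to lists of registered handlers. OpenROCS itself is built in PHP with XML configuration files; the Python below only illustrates the idea, and the event names and handlers are invented.

      # Toy event-action dispatcher in the spirit of an observatory controller.
      actions = {}

      def on(event):
          def register(handler):
              actions.setdefault(event, []).append(handler)
              return handler
          return register

      @on("rain_detected")
      def close_dome():
          print("closing dome")

      @on("rain_detected")
      def abort_exposure():
          print("aborting current exposure")

      @on("target_below_horizon")
      def next_target():
          print("selecting next target from scheduler")

      def dispatch(event):
          for handler in actions.get(event, []):
              handler()

      dispatch("rain_detected")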

  16. CUSTOMER RESPONSE TO BESTPRACTICES TRAINING AND SOFTWARE TOOLS PROVIDED BY DOE'S INDUSTRIAL TECHNOLOGIES PROGRAM

    SciTech Connect

    Schweitzer, Martin; Martin, Michaela A; Schmoyer, Richard L

    2008-03-01

    The BestPractices program area, which has evolved into the Save Energy Now (SEN) Initiative, is a component of the U.S. Department of Energy's (DOE's) Industrial Technologies Program (ITP) that provides technical assistance and disseminates information on energy-efficient technologies and practices to U.S. industrial firms. The BestPractices approach to information dissemination includes conducting training sessions which address energy-intensive systems (compressed air, steam, process heat, pumps, motors, and fans) and distributing DOE software tools on those same topics. The current report documents a recent Oak Ridge National Laboratory (ORNL) study undertaken to determine the implementation rate, attribution rate, and reduction factor for industrial end-users who received BestPractices training and registered software in FY 2006. The implementation rate is the proportion of service recipients taking energy-saving actions as a result of the service received. The attribution rate applies to those individuals taking energy-saving actions as a result of the services received and represents the portion of the savings achieved through those actions that is due to the service. The reduction factor is the saving that is realized from program-induced measures as a proportion of the potential savings that could be achieved if all service recipients took action. In addition to examining those factors, the ORNL study collected information on selected characteristics of service recipients, the perceived value of the services provided, and the potential energy savings that can be achieved through implementation of measures identified from the training or software. Because the provision of training is distinctly different from the provision of software tools, the two efforts were examined independently and the findings for each are reported separately.

  17. Klonos: A Similarity Analysis Based Tool for Software Porting

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It combines syntactic and cost-model-provided metrics to cluster similar subroutines that can be ported in a similar way. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  18. RFcap: a software analysis tool for multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Dillier, Norbert

    2013-03-01

    Being able to display and analyse the output of a speech processor that encodes the parameters of complex stimuli to be presented by a cochlear implant (CI) is useful for software and hardware development as well as for diagnostic purposes. This requires, firstly, appropriate hardware that is able to receive and decode the radio frequency (RF)-coded signals, and secondly, suitable software to process the decoded data. The PCI-IF6 clinical hardware for the Nucleus CI system, together with the Nucleus Implant Communicator and Nucleus Matlab Toolbox research software libraries, provides the necessary functionality. RFcap is a standalone Matlab application that encapsulates the relevant functions to capture, display, and analyse the RF-coded signals intended for the Nucleus CI24M/R, CI24RE, and CI500 multichannel CIs. PMID:21762546

  19. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada Source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  20. adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.

    2004-01-01

    A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.

  1. Software Validation, Verification, and Testing Technique and Tool Reference Guide. Final Report.

    ERIC Educational Resources Information Center

    Powell, Patricia B., Ed.

    Intended as an aid in the selection of software techniques and tools, this document contains three sections: (1) a suggested methodology for the selection of validation, verification, and testing (VVT) techniques and tools; (2) summary matrices by development phase usage, a table of techniques and tools with associated keywords, and an…

  2. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  3. A Study of Collaborative Software Development Using Groupware Tools

    ERIC Educational Resources Information Center

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  4. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  5. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality for use in the development of credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected is used to confirm the results of two Boeing studies, reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that a program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some of the current models of software reliability. The data also raises new questions concerning the phenomenon of interacting faults.

  6. AWG-Parameters: new software tool to design arrayed waveguide gratings

    NASA Astrophysics Data System (ADS)

    Seyringer, D.; Bielik, M.

    2013-03-01

    A new software tool and its application to the design of optical multiplexers/demultiplexers based on arrayed waveguide gratings is presented. The motivation for this work is the fact that when designing arrayed waveguide gratings, a set of geometrical parameters must first be calculated. These parameters are the input for the AWG layout that will be created and simulated using commercial photonic design tools. It is important to point out that these parameters strongly influence the demultiplexing properties of the AWG and therefore have to be calculated very carefully. However, most commercial photonic design tools do not support this fundamental calculation. To be able to design any AWG with any software tool, and particularly to save the time needed for AWG design, a new software tool was developed. The tool has already been applied in various AWG designs and is also technologically well proven.
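
    For readers unfamiliar with the kind of pre-calculation the paper refers to, the sketch below evaluates two standard textbook AWG relations: the path-length increment delta_L = m * lambda_c / n_c and the free spectral range FSR ~ c / (n_g * delta_L). The numerical values are illustrative and are not taken from the paper.

      # Textbook AWG geometry pre-calculation (illustrative values).
      C = 299792458.0                     # speed of light, m/s

      def awg_parameters(center_wavelength_m, order, n_eff, n_group):
          delta_l = order * center_wavelength_m / n_eff            # path increment
          fsr_hz = C / (n_group * delta_l)                         # FSR, frequency
          fsr_m = center_wavelength_m ** 2 / (n_group * delta_l)   # FSR, wavelength
          return delta_l, fsr_hz, fsr_m

      dl, fsr_hz, fsr_m = awg_parameters(1.55e-6, order=30, n_eff=1.45, n_group=1.47)
      print(f"delta_L = {dl * 1e6:.2f} um, FSR = {fsr_hz / 1e12:.2f} THz "
            f"= {fsr_m * 1e9:.1f} nm")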

  7. Software development tools for the CDF MX scanner

    SciTech Connect

    Stuermer, W.; Turner, K.; Littleton-Sestini, S.

    1991-11-01

    This paper discusses the design of the high-level assembler and diagnostic control program developed for the MX, a high-speed, custom-designed computer used in the CDF data acquisition system at Fermilab. These programs provide a friendly, productive environment for the development of software on the MX. Details of their implementation and special features, and some of the lessons learned during their development, are included.

  8. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  9. Focus: Design and Evaluation of a Software Tool for Collecting Reader Feedback.

    ERIC Educational Resources Information Center

    de Jong, Menno; Lentz, Leo

    2001-01-01

    Describes "Focus," a software tool for collecting reader comments more efficiently. Discusses the design and rationale of the software. Notes that results obtained using Focus were compared to the reader feedback collected under the plus-minus method. Concludes that Focus participants appeared to comment more from a reviewer's and less from a…

  10. Slower Algebra Students Meet Faster Tools: Solving Algebra Word Problems with Graphing Software

    ERIC Educational Resources Information Center

    Yerushalmy, Michal

    2006-01-01

    The article discusses the ways that less successful mathematics students used graphing software with capabilities similar to a basic graphing calculator to solve algebra problems in context. The study is based on interviewing students who learned algebra for 3 years in an environment where software tools were always present. We found differences…

  11. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  12. Software tools for the analysis of video meteors emission spectra

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

    One of the goals of the SPanish Meteor Network (SPMN) is the study of the chemical composition of meteoroids by analyzing the emission spectra resulting from the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are endowed with holographic diffraction gratings, and continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.

  13. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools, and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  14. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

    We present a new data mining software tool called MineTool for analysis and modeling of space physics data. MineTool is a graphical user interface implementation that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al, 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al, 2009]. By virtue of automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is written entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also enables the inclusion of only relevant variables. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the “black box” form characteristic of some other techniques. Having the analytical model enables deciphering the importance of various variables in affecting the outcome. The MineTool suite also provides capabilities for data preparation for data mining as well as visualization of the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth’s magnetopause in the Cluster spacecraft time series data and 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.

  15. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  16. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
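
    One of the algorithm families listed above, maximum likelihood deconvolution, is commonly realized as the Richardson-Lucy iteration. The sketch below is a compact NumPy version applied to a toy blurred point source; it illustrates the iteration only and is not the IDL code of DeConv_Tool.

      # Richardson-Lucy (maximum likelihood) deconvolution, NumPy sketch.
      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(blurred, psf, iterations=25):
          psf = psf / psf.sum()
          psf_mirror = psf[::-1, ::-1]
          blurred = np.maximum(blurred, 0)          # guard against FFT round-off
          estimate = np.full_like(blurred, blurred.mean())
          for _ in range(iterations):
              ratio = blurred / (fftconvolve(estimate, psf, mode="same") + 1e-12)
              estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
          return estimate

      # Toy example: blur a point source, then deconvolve it back.
      true_image = np.zeros((32, 32))
      true_image[16, 16] = 100.0
      x = np.arange(-3, 4)
      psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 2.0)
      blurred = fftconvolve(true_image, psf / psf.sum(), mode="same")
      restored = richardson_lucy(blurred, psf)
      print(blurred[16, 16], restored[16, 16])      # flux re-concentrates at the peak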

  17. FIND: A new software tool and development platform for enhanced multicolor flow analysis

    PubMed Central

    2011-01-01

    Background Flow Cytometry is a process by which cells, and other microscopic particles, can be identified, counted, and sorted mechanically through the use of hydrodynamic pressure and laser-activated fluorescence labeling. As immunostained cells pass individually through the flow chamber of the instrument, laser pulses cause fluorescence emissions that are recorded digitally for later analysis as multidimensional vectors. Current, widely adopted analysis software limits users to manual separation of events based on viewing two or three simultaneous dimensions. While this may be adequate for experiments using four or fewer colors, advances have led to laser flow cytometers capable of recording 20 different colors simultaneously. In addition, mass-spectrometry based machines capable of recording at least 100 separate channels are being developed. Analysis of such high-dimensional data by visual exploration alone can be error-prone and susceptible to unnecessary bias. Fortunately, the field of Data Mining provides many tools for automated group classification of multi-dimensional data, and many algorithms have been adapted or created for flow cytometry. However, the majority of this research has not been made available to users through analysis software packages and, as such, is not in wide use. Results We have developed a new software application for analysis of multi-color flow cytometry data. The main goals of this effort were to provide a user-friendly tool for automated gating (classification) of multi-color data as well as a platform for development and dissemination of new analysis tools. With this software, users can easily load single or multiple data sets, perform automated event classification, and graphically compare results within and between experiments. We also make available a simple plugin system that enables researchers to implement and share their data analysis and classification/population discovery algorithms. Conclusions The FIND (Flow
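
    Automated gating is essentially unsupervised classification of events in fluorescence space. As a hedged illustration (FIND's actual algorithms are not detailed here), the sketch below runs a tiny k-means over synthetic two-channel events.

      # Toy automated gating: k-means over two synthetic cell populations.
      import numpy as np

      rng = np.random.default_rng(0)
      events = np.vstack([rng.normal((2.0, 5.0), 0.3, (500, 2)),
                          rng.normal((6.0, 1.5), 0.4, (500, 2))])

      def kmeans(data, k, iterations=50):
          centers = data[rng.choice(len(data), k, replace=False)]
          for _ in range(iterations):
              labels = np.argmin(((data[:, None, :] - centers[None, :, :]) ** 2)
                                 .sum(axis=2), axis=1)
              centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
          return labels, centers

      labels, centers = kmeans(events, k=2)
      print("population sizes:", np.bincount(labels))
      print("population centers:", centers)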

  18. An internet-based software tool for submitting crime information to forensic laboratories

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Rashpal S.; Govindarajulu, Sriram

    2004-11-01

    This paper describes an internet-based software tool developed for the West Virginia State Police Forensics Laboratory. The software enables law enforcement agents to submit crime information to the forensic laboratory via a secure Internet connection. Online electronic forms were created to mirror the existing paper-based forms, making the transition easier. The process of submitting case information was standardized and streamlined, thereby minimizing information inconsistency. The crime information, once gathered, is automatically stored in a database and can be viewed and queried by authorized law enforcement officers. The software tool will be deployed in all counties of WV.

  19. The Web Interface Template System (WITS), a software developer's tool

    SciTech Connect

    Lauer, L.J.; Lynam, M.; Muniz, T.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  20. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    PubMed

    Ribes, Delphine; Parafita, Julia; Charrier, Rémi; Magara, Fulvio; Magistretti, Pierre J; Thiran, Jean-Philippe

    2010-01-01

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool. PMID:21124830

  1. RadicalLocator: A software tool for identifying the radicals in Chinese characters.

    PubMed

    Yu, Lili; Reichle, Erik D; Jones, Mathew; Liversedge, Simon P

    2015-09-01

    This article describes a new software tool called RadicalLocator that can be used to automatically identify (e.g., for visual inspection) individual target radicals (i.e., groups of strokes) in written Chinese characters. We first briefly clarify why this software is useful for research purposes and discuss the factors that make this pattern recognition task so difficult. We then describe how the software can be downloaded and installed, and used to identify the radicals in characters for the purposes of, for example, selecting materials for psycholinguistic experiments. Finally, we discuss several known limitations of the software and heuristics for addressing them. PMID:25169830

  2. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  3. A Tool to Enhance Cooperation and Knowledge Transfer among Software Developers

    NASA Astrophysics Data System (ADS)

    Aydin, Seçil; Mishra, Deepti

    Software developers have been successfully tailoring software development methods according to the project situation, and more so in small-scale software development organizations. There is a need to share this knowledge with other developers who may be facing the same project situation so that they can benefit from other people's experiences. In this paper, an approach to enhance cooperation among software developers, in terms of sharing the knowledge that was used successfully in past projects, is proposed. A web-based tool is developed that can assist in the creation, storage and extraction of methods related to the requirement elicitation phase. These methods are categorized according to certain criteria that help in searching for the method that will be most appropriate in a given project situation. This approach and tool can also be used for other software development activities.

  4. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc Flash Protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.
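
    One of the published formulas such programs incorporate is the theoretically derived Lee method, commonly stated as E = 2.142e6 * V * Ibf * t / D^2 (E in J/cm^2, V in kV, Ibf in kA, t in s, D in mm). Solving for the distance at which E falls to 1.2 cal/cm^2 (5.0 J/cm^2) gives the arc flash boundary; the sketch below does exactly that, with illustrative input values. Treat it as a sketch only: commercial tools apply the full standard, including its empirically derived equations.

      # Arc flash boundary via the Lee method (illustrative inputs).
      import math

      def lee_arc_flash_boundary_mm(system_kv, bolted_fault_ka, clearing_time_s,
                                    e_boundary_j_cm2=5.0):
          # Solve E = 2.142e6 * V * Ibf * t / D^2 for D at the boundary energy.
          return math.sqrt(2.142e6 * system_kv * bolted_fault_ka
                           * clearing_time_s / e_boundary_j_cm2)

      d_mm = lee_arc_flash_boundary_mm(0.48, 25.0, 0.2)
      print(f"arc flash boundary: {d_mm:.0f} mm ({d_mm / 304.8:.1f} ft)")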

  5. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  6. Tool School. Review Software for Basic CHOICE. CHOICE (Challenging Options in Career Education).

    ERIC Educational Resources Information Center

    Pitts, Ilse M.; And Others

    CHOICE Tool School is an Apple computer software program designed to reinforce job and role information presented to primary-aged migrant students in the Basic Job and Role activity folders and workbooks. Learners must decide if randomly displayed tools are or are not used by the worker selected for the game theme. Learners may choose the level of…

  7. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  8. Development of a software tool for an internal dosimetry using MIRD method

    NASA Astrophysics Data System (ADS)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed. Many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, a software tool that can perform internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated the residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software tool for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
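
    The MIRD workflow described above can be reduced to a few lines: integrate the time-activity curve to obtain the cumulated activity (trapezoids plus a mono-exponential tail), divide by the administered activity for the residence time, and multiply by an S value for the absorbed dose. The S value and the measurements below are invented for illustration; real S values come from published MIRD tables or OLINDA/EXM.

      # Toy MIRD dose calculation from a sampled time-activity curve.
      import numpy as np

      t_h = np.array([1.0, 4.0, 24.0, 48.0, 72.0])                # hours post-injection
      activity_mbq = np.array([180.0, 150.0, 80.0, 40.0, 20.0])   # organ activity
      a0_mbq = 200.0                                              # administered activity

      # Cumulated activity: trapezoids over the samples plus an analytic tail
      # assuming mono-exponential decay after the last measurement.
      a_tilde = float(np.sum((activity_mbq[1:] + activity_mbq[:-1]) / 2.0
                             * np.diff(t_h)))
      decay_per_h = np.log(activity_mbq[-2] / activity_mbq[-1]) / (t_h[-1] - t_h[-2])
      a_tilde += activity_mbq[-1] / decay_per_h                   # integral of the tail

      residence_time_h = a_tilde / a0_mbq                         # MIRD residence time
      S_MGY_PER_MBQ_H = 1.2e-4                                    # invented S value
      dose_mgy = a_tilde * S_MGY_PER_MBQ_H

      print(f"residence time = {residence_time_h:.1f} h, dose = {dose_mgy:.2f} mGy")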

  9. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  10. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    NASA Technical Reports Server (NTRS)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U. S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June, 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  11. A software tool for tomographic axial superresolution in STED microscopy.

    PubMed

    Koho, S; Deguchi, T; Hänninen, P E

    2015-11-01

    A method for generating three-dimensional tomograms from multiple three-dimensional axial projections in STimulated Emission Depletion (STED) superresolution microscopy is introduced. Our STED< method, based on the use of a micromirror placed on top of a standard microscopic sample, is used to record a three-dimensional projection at an oblique angle in relation to the main optical axis. Combining the STED< projection with the regular STED image into a single view by tomographic reconstruction, is shown to result in a tomogram with three-to-four-fold improved apparent axial resolution. Registration of the different projections is based on the use of a mutual-information histogram similarity metric. Fusion of the projections into a single view is based on Richardson-Lucy iterative deconvolution algorithm, modified to work with multiple projections. Our tomographic reconstruction method is demonstrated to work with real biological STED superresolution images, including a data set with a limited signal-to-noise ratio (SNR); the reconstruction software (SuperTomo) and its source code will be released under BSD open-source license. PMID:26258639
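
    The mutual-information histogram similarity metric used for registering the projections can be sketched in a few lines of NumPy. This is the textbook definition of mutual information over a joint intensity histogram, not SuperTomo's implementation.

      # Mutual information between two images via a joint histogram.
      import numpy as np

      def mutual_information(image_a, image_b, bins=64):
          joint, _, _ = np.histogram2d(image_a.ravel(), image_b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nonzero = pxy > 0
          return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero]
                                                    / (px @ py)[nonzero])))

      rng = np.random.default_rng(1)
      a = rng.random((128, 128))
      print(mutual_information(a, a))                    # high: identical images
      print(mutual_information(a, rng.random(a.shape)))  # near zero: unrelated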

  12. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  13. Automated software development tools in the MIS (Management Information Systems) environment

    SciTech Connect

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  14. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two focuses of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  15. Methods and software tools for design evaluation in population pharmacokinetics–pharmacodynamics studies

    PubMed Central

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)–pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization. PMID:24548174

  16. Methods and software tools for design evaluation in population pharmacokinetics-pharmacodynamics studies.

    PubMed

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)-pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization. PMID:24548174
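
    A much-simplified, individual-level analogue of what these tools compute is sketched below: build the Fisher information matrix of a design from finite-difference sensitivities of a one-compartment oral PK model with additive error, and predict parameter standard errors as the square roots of the diagonal of its inverse. Population models add random effects and more elaborate approximations; all parameter values here are invented.

      # Predicting parameter SEs from the Fisher information matrix of a design.
      import numpy as np

      def conc(theta, t):
          ka, ke, v = theta                     # absorption, elimination, volume
          dose = 100.0
          return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

      def fim(theta, times, sigma=0.1):
          # Sensitivities by central differences: J[i, p] = d conc / d theta_p.
          J = np.empty((len(times), len(theta)))
          for p in range(len(theta)):
              h = 1e-6 * theta[p]
              up, dn = np.array(theta, float), np.array(theta, float)
              up[p] += h
              dn[p] -= h
              J[:, p] = (conc(up, times) - conc(dn, times)) / (2 * h)
          return J.T @ J / sigma ** 2           # additive-error Fisher information

      theta = (1.0, 0.1, 20.0)
      design = np.array([0.5, 1.0, 2.0, 6.0, 12.0, 24.0])   # sampling times (h)
      predicted_se = np.sqrt(np.diag(np.linalg.inv(fim(theta, design))))
      print("predicted SE (ka, ke, V):", predicted_se)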

  17. A diagnostic tool for malaria based on computer software

    PubMed Central

    Kotepui, Manas; Uthaisar, Kwuntida; Phunphuech, Bhukdee; Phiwklam, Nuoil

    2015-01-01

    Nowadays, the gold standard method for malaria diagnosis is the staining of thick and thin blood films examined by expert laboratorists. It requires well-trained laboratorists, is time consuming, and is not an automated protocol. For this study, Maladiag Software was developed to predict malaria infection in suspected malaria patients. The demographic data of patients, examination for malaria parasites, and complete blood count (CBC) profiles were analyzed. Binary logistic regression was used to create the equation for the malaria diagnosis. The diagnostic parameters of the equation were tested on 4,985 samples (703 infected and 4,282 control samples). The equation indicated 81.2% sensitivity and 80.3% specificity for predicting infection of malaria. The positive likelihood and negative likelihood ratios were 4.12 (95% CI = 4.01–4.23) and 0.23 (95% CI = 0.22–0.25), respectively. The equation also had a significant odds ratio (P value < 0.0001, OR = 17.6, 95% CI = 16.0–19.3). The equation can predict malaria infection after adjusting for age, gender, nationality, monocyte (%), platelet count, neutrophil (%), lymphocyte (%), and the RBC count of patients. The diagnostic accuracy (area under the curve, AUC) was 0.877 (95% CI = 0.871–0.883). The system, when used in combination with other clinical and microscopy methods, might improve malaria diagnoses and enhance prompt treatment. PMID:26559606
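
    As an illustration of the modelling step described above, the sketch below fits a binary logistic regression to predict infection status from patient features. The feature list mirrors the abstract's predictors, but the data, the library choice (scikit-learn) and all values are assumptions for demonstration, not the study's Maladiag implementation.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # Synthetic stand-ins for: age, gender, monocyte %, platelet count,
        # neutrophil %, lymphocyte %, RBC count (the study's adjusted predictors)
        X = rng.normal(size=(500, 7))
        y = rng.integers(0, 2, size=500)          # synthetic infected / control labels

        model = LogisticRegression(max_iter=1000).fit(X, y)
        print(model.predict_proba(X[:5])[:, 1])   # predicted probabilities of infection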

  18. Non invasive ventilation as an additional tool for exercise training.

    PubMed

    Ambrosino, Nicolino; Cigni, Paolo

    2015-01-01

    Recently, there has been increasing interest in the use of non invasive ventilation (NIV) to increase exercise capacity. In individuals with COPD, NIV during exercise reduces dyspnoea and increases exercise tolerance. Different modalities of mechanical ventilation have been used non-invasively as a tool to increase exercise tolerance in COPD, heart failure and lung and thoracic restrictive diseases. Inspiratory support provides symptomatic benefit by unloading the ventilatory muscles, whereas Continuous Positive Airway Pressure (CPAP) counterbalances the intrinsic positive end-expiratory pressure in COPD patients. Severe stable COPD patients undergoing home nocturnal NIV and daytime exercise training showed some benefits. Furthermore, it has been reported that in chronic hypercapnic COPD under long-term ventilatory support, NIV can also be administered during walking. Despite these results, the role of NIV as a routine component of pulmonary rehabilitation is still to be defined. PMID:25874110

  19. PAW, a general-purpose portable software tool for data analysis and presentation

    NASA Astrophysics Data System (ADS)

    Brun, René; Couet, Olivier; Vandoni, Carlo E.; Zanarini, Pietro

    1989-12-01

    During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to high energy physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, where man-machine interaction and graphics play a key role (PAW - Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years.

  20. [Software CMAP TOOLS ™ to build concept maps: an evaluation by nursing students].

    PubMed

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to write. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To do this, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focal group technique. The results showed that the software facilitates and guarantees the organization, visualization, and correlation of the data, although there are initial difficulties in handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software use. PMID:23018409

  1. Software Mapping Assessment Tool Documenting Behavioral Content in Computer Interaction: Examples of Mapped Problems with "Kid Pix" Program

    ERIC Educational Resources Information Center

    Bayram, Servet

    2005-01-01

    The purpose of software mapping is to delineate a method for software menu, tool, and palette use in the construction of elementary school science and mathematics curriculum activities. With this method, software "maps" were created for traversing science and math curriculum problems and activities using software. The other purpose of…

  2. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  3. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses. PMID:27487387

  4. Characterizing Verification Tools Through Coding Error Candidates Reported in Space Flight Software

    NASA Astrophysics Data System (ADS)

    Prause, Christian R.; Gerlich, Ralf; Gerlich, Rainer; Fischer, Anton

    2015-09-01

    Mastering the continuously increasing amount of software requires identification of more efficient strategies for software verification. Currently, fault coverage is only indirectly addressed, e.g. by code coverage. The idea as presented in this paper is to get a better understanding of fault coverage by a systematic classification of software fault types, derivation of footprints of verification tools regarding coverage of such fault types, and recording of required effort. A number of issues regarding fault identification and classification are discussed in this context.

  5. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  6. Proceedings of the Workshop on software tools for distributed intelligent control systems

    SciTech Connect

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  7. GenomeTools: a comprehensive software library for efficient processing of structured genome annotations.

    PubMed

    Gremme, Gordon; Steinbiss, Sascha; Kurtz, Stefan

    2013-01-01

    Genome annotations are often published as plain text files describing genomic features and their subcomponents by an implicit annotation graph. In this paper, we present the GenomeTools, a convenient and efficient software library and associated software tools for developing bioinformatics software intended to create, process or convert annotation graphs. The GenomeTools strictly follow the annotation graph approach, offering a unified graph-based representation. This gives the developer intuitive and immediate access to genomic features and tools for their manipulation. To process large annotation sets with low memory overhead, we have designed and implemented an efficient pull-based approach for sequential processing of annotations. This makes it possible to handle even the largest annotation sets, such as a complete catalogue of human variations. Our object-oriented C-based software library enables a developer to conveniently implement their own functionality on annotation graphs and to integrate it into larger workflows, simultaneously accessing compressed sequence data if required. The careful C implementation of the GenomeTools not only ensures a light-weight memory footprint while allowing full sequential as well as random access to the annotation graph, but also facilitates the creation of bindings to a variety of script programming languages (like Python and Ruby) sharing the same interface. PMID:24091398
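
    The pull-based processing the abstract highlights can be pictured with a generator-style analogy: consumers request one annotation node at a time, so memory use stays flat regardless of file size. GenomeTools itself is a C library; the Python sketch below, with its deliberately minimal GFF3 parsing, only illustrates the idea.

        def parse_gff3(path):
            """Yield one feature node at a time instead of loading the whole file."""
            with open(path) as fh:
                for line in fh:
                    if line.startswith("#") or not line.strip():
                        continue
                    cols = line.rstrip("\n").split("\t")
                    yield {"seqid": cols[0], "type": cols[2],
                           "start": int(cols[3]), "end": int(cols[4])}

        def count_genes(path):
            # Downstream code "pulls" nodes one by one; memory stays constant.
            return sum(1 for node in parse_gff3(path) if node["type"] == "gene")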

  8. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. PMID:25359577
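
    Since the review found that plain thresholding sufficed for figure-ground separation in all of its test micrographs, a minimal sketch of that step may be useful. It uses scikit-image with Otsu's data-driven threshold; the file name is a placeholder and the approach is generic practice, not code from any of the 15 reviewed tools.

        from skimage import io, filters, measure

        img = io.imread("cells.tif", as_gray=True)   # placeholder micrograph file
        mask = img > filters.threshold_otsu(img)     # global, data-driven threshold
        labels = measure.label(mask)                 # connected components ~ candidate cells
        print("objects found:", labels.max())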

  9. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards' compliance checkers to tools using more formal methods for verification. As safety-critical software complexity is increasing, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and quality of the results. Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations were made by analysing the previously referred subjects, namely regarding cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  10. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-01-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years. PMID:20717075

  11. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  12. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    SciTech Connect

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while the VTK was used to build and render 3-D mesh models of critical RT structures with real-time, 360° visualization. Principal component analysis (PCA) was used so that the system could self-update geometry variations of normal structures, with physician-approved RT contours as the training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented by using C# and Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
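
    The supervised PCA screening described above can be sketched as follows: fit PCA to feature vectors from physician-approved contours, then flag a new contour whose reconstruction error exceeds a tolerance learned from that training set. This Python/scikit-learn sketch is an analogy to the authors' C#/Accord.Net implementation; the feature dimensions and the 95th-percentile cutoff are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA

        approved = np.random.default_rng(1).normal(size=(100, 30))  # synthetic training features

        pca = PCA(n_components=5).fit(approved)
        recon = pca.inverse_transform(pca.transform(approved))
        errors = ((approved - recon) ** 2).sum(axis=1)
        cutoff = np.percentile(errors, 95)            # tolerance from approved contours

        def is_abnormal(features: np.ndarray) -> bool:
            r = pca.inverse_transform(pca.transform(features[None, :]))
            return ((features - r) ** 2).sum() > cutoff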

  13. sigTOOL: A MATLAB-based environment for sharing laboratory-developed software to analyze biological signals.

    PubMed

    Lidierth, Malcolm

    2009-03-30

    This paper describes a software package, named sigTOOL, for processing biological signals. The package runs in the MATLAB programming environment and has been designed to promote the sharing of laboratory-developed software across the worldwide web. As proof-of-concept of the design of the system, sigTOOL has been used to build an analysis application for dealing with neuroscience data complete with a user-friendly graphical user interface which implements a range of waveform and spike-train analysis functions. The interface allows many commonly used neuroscience data file formats to be loaded (including those of Alpha Omega, Cambridge Electronic Design, Cyberkinetics Inc., Molecular Devices, Nex Technologies and Plexon Instruments). Waveform analysis functions selectable from the interface support waveform averaging (mean and median), auto- and cross-correlation, power spectral analysis, coherence estimation, digital filtering (feedback and feedforward) and resampling. Spike-train analyses include interspike interval distributions, Poincaré plots, event auto- and cross-correlations, spike-triggered averaging, stimulus driven and phase-related peri-event time histograms and rasters as well as frequencygrams. User-developed additions to sigTOOL that are archived and distributed electronically will be added to the sigTOOL interface on-the-fly, without the need to modify the core sigTOOL code. Full sigTOOL functionality will be provided to support the user-developed code, including the ability to record a user action history for batch processing of files and support for exporting the results of analyses to external graphics editing software and spreadsheet-based data processing packages. PMID:19056423
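
    As a flavour of the spike-train analyses listed, the sketch below computes an interspike interval (ISI) distribution, the simplest of those measures, from a vector of spike times. sigTOOL itself runs in MATLAB; this NumPy version on synthetic data is only an illustration.

        import numpy as np

        spikes = np.sort(np.random.default_rng(2).uniform(0, 10, size=200))  # spike times (s)
        isi = np.diff(spikes)                          # intervals between consecutive spikes
        counts, edges = np.histogram(isi, bins=50, range=(0, 0.5))
        print("mean ISI %.4f s, CV %.2f" % (isi.mean(), isi.std() / isi.mean()))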

  14. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  15. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC through well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368
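
    The scheduling idea, measuring each transition only inside a retention-time window so that thousands of transitions fit into one run, can be sketched directly. The snippet below checks which hypothetical transitions are active at a given moment; all peptide names, retention times and window widths are invented for illustration.

        transitions = [
            {"peptide": "PEPTIDEA", "rt": 12.4, "window": 2.0},   # rt and window in minutes
            {"peptide": "PEPTIDEB", "rt": 12.9, "window": 2.0},
            {"peptide": "PEPTIDEC", "rt": 31.0, "window": 2.0},
        ]

        def active(transitions, t):
            """Transitions whose retention-time window covers time t."""
            return [tr["peptide"] for tr in transitions
                    if abs(t - tr["rt"]) <= tr["window"] / 2]

        print(active(transitions, 12.5))   # -> ['PEPTIDEA', 'PEPTIDEB']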

  16. A Guide to the Use of Tool Software for the Apple Computer.

    ERIC Educational Resources Information Center

    Collett, Charles R.; Goldberg, Fred S.

    Designed to give teachers and supervisors a working knowledge of various approaches to enhancing pupil learning through software application programs, this guide is presented in a hands-on fashion. It supports a dual purpose, i.e., it can serve as an individual tutorial or as a turnkey staff development tool. All program files referred to may be…

  17. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    ERIC Educational Resources Information Center

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  18. DairyGEM: A software tool for assessing emissions and mitigation strategies for dairy production systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many gaseous compounds are emitted from dairy farms. Those of current interest include the toxic compounds of ammonia and hydrogen sulfide and the greenhouse gases of methane, nitrous oxide and carbon dioxide. A relatively easy to use software tool was developed that predicts these emissions through...

  19. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC through well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  20. mMass as a Software Tool for the Annotation of Cyclic Peptide Tandem Mass Spectra

    PubMed Central

    Niedermeyer, Timo H. J.; Strohalm, Martin

    2012-01-01

    Natural or synthetic cyclic peptides often possess pronounced bioactivity. Their mass spectrometric characterization is difficult due to the predominant occurrence of non-proteinogenic monomers and the complex fragmentation patterns observed. Even though several software tools for cyclic peptide tandem mass spectra annotation have been published, these tools are still unable to annotate a majority of the signals observed in experimentally obtained mass spectra. They are thus not suitable for extensive mass spectrometric characterization of these compounds. This lack of advanced and user-friendly software tools has motivated us to extend the fragmentation module of a freely available open-source software, mMass (http://www.mmass.org), to allow for cyclic peptide tandem mass spectra annotation and interpretation. The resulting software has been tested on several cyanobacterial and other naturally occurring peptides. It has been found to be superior to other currently available tools concerning both usability and annotation extensiveness. Thus it is highly useful for accelerating the structure confirmation and elucidation of cyclic as well as linear peptides and depsipeptides. PMID:23028676

  1. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated...

  2. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    ERIC Educational Resources Information Center

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  3. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  4. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    PubMed

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from . PMID:26030926
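
    A minimal sketch of the kind of coordinate check SDMdata automates is shown below; the record layout mimics GBIF's decimalLatitude/decimalLongitude fields, but the function is an illustration, not SDMdata's actual code.

        def coordinate_errors(records):
            """Flag records whose latitude/longitude are missing or out of range."""
            bad = []
            for rec in records:
                lat, lon = rec.get("decimalLatitude"), rec.get("decimalLongitude")
                if lat is None or lon is None or abs(lat) > 90 or abs(lon) > 180:
                    bad.append(rec)
            return bad

        records = [{"species": "Bufo gargarizans", "decimalLatitude": 34.2,
                    "decimalLongitude": 108.9},
                   {"species": "Bufo gargarizans", "decimalLatitude": 134.2,
                    "decimalLongitude": 108.9}]
        print(coordinate_errors(records))    # the second record is flagged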

  5. Splash: A Software Tool for Stereotactic Planning of Recording Chamber Placement and Electrode Trajectories

    PubMed Central

    Sperka, Daniel J.; Ditterich, Jochen

    2011-01-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories. PMID:21472085

  6. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records

    PubMed Central

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from . PMID:26030926

  7. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  8. Diva software, a tool for European regional seas and Ocean climatologies production

    NASA Astrophysics Data System (ADS)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is a software tool based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of in situ measurements and the anisotropy due to advection and to irregular coastlines and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva consists in minimizing a variational principle which accounts for the differences between the observations and the reconstructed field and for the influence of the gradients and variability of the reconstructed field. The resolution of the numerical problem is based on the finite-element method, which allows great numerical efficiency and the handling of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and has set up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods. Among these, it uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make the production of climatologies easier for users. Among these tools: the advection constraint during the field reconstruction, through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; the Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the
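
    For readers unfamiliar with the VIM, a commonly cited form of the cost function that Diva minimizes (after Brasseur et al., 1996) is reproduced below, up to notational choices; here φ is the reconstructed field, d_i are the N observations with weights μ_i, and the α coefficients penalize the field itself, its gradients, and its curvature over the domain D.

        J[\varphi] = \sum_{i=1}^{N} \mu_i \left[ d_i - \varphi(x_i, y_i) \right]^2
                   + \int_{D} \left( \alpha_2\, \nabla\nabla\varphi : \nabla\nabla\varphi
                   + \alpha_1\, \nabla\varphi \cdot \nabla\varphi
                   + \alpha_0\, \varphi^2 \right) dD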

  9. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    SciTech Connect

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on configuring and building software constitutes a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  10. An infrastructure for the creation of high end scientific and engineering software tools and applications

    SciTech Connect

    Drummond, L.A.; Marques, O.A.; Wilson, G.V.

    2003-04-01

    This document has been prepared as a response to the High End Computing Revitalization Task Force (HECRTF) call for white papers. Our main goal is to identify the mechanisms necessary for the design and implementation of an infrastructure to support development of high-end scientific and engineering software tools and applications. This infrastructure will provide a plethora of software services to facilitate the efficient deployment of future HEC technology as well as collaborations among researchers and engineers across disciplines and institutions. In particular, we address here the following points: key software technologies that must be advanced to strengthen the foundation for developing new generations of HEC systems, and a software infrastructure for minimizing "time to solution" by users of HEC systems.

  11. RNAsoft: a suite of RNA secondary structure prediction and design software tools

    PubMed Central

    Andronescu, Mirela; Aguirre-Hernández, Rosalía; Condon, Anne; Hoos, Holger H.

    2003-01-01

    DNA and RNA strands are employed in novel ways in the construction of nanostructures, as molecular tags in libraries of polymers and in therapeutics. New software tools for prediction and design of molecular structure will be needed in these applications. The RNAsoft suite of programs provides tools for predicting the secondary structure of a pair of DNA or RNA molecules, testing that combinatorial tag sets of DNA and RNA molecules have no unwanted secondary structure and designing RNA strands that fold to a given input secondary structure. The tools are based on standard thermodynamic models of RNA secondary structure formation. RNAsoft can be found online at http://www.RNAsoft.ca. PMID:12824338

  12. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  13. A software tool for automatic analysis of selected area diffraction patterns within Digital Micrograph™.

    PubMed

    Wu, C H; Reynolds, W T; Murayama, M

    2012-01-01

    A software package "SADP Tools" is developed as a complementary diffraction pattern analysis tool. The core program, called AutoSADP, is designed to facilitate automated measurements of d-spacing and interplaner angles from TEM selected area diffraction patterns (SADPs) of single crystals. The software uses iterative cross correlations to locate the forward scattered beam position and to find the coordinates of the diffraction spots. The newly developed algorithm is suitable for fully automated analysis and it works well with asymmetric diffraction patterns, off-zone axis patterns, patterns with streaks, and noisy patterns such as Fast Fourier transforms of high-resolution images. The AutoSADP tool runs as a macro for the Digital Micrograph program and can determine d-spacing values and interplanar angles based on the pixel ratio with an accuracy of better than about 2%. PMID:22079497

  14. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    PubMed

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, operated based on a text mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT, combined with manual categorization, will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D. PMID:26452016

  15. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    PubMed

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  16. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    PubMed

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator a completely automated software tool, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/. PMID:17707944

  17. The impact of layer thickness on the performance of additively manufactured lapping tools

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2015-10-01

    Lower cost additive manufacturing (AM) machines which have emerged in recent years are capable of producing tools, jigs, and fixtures that are useful in optical fabrication. In particular, AM tooling has been shown to be useful in lapping glass workpieces. Various AM machines are distinguished by the processes, materials, build times, and build resolution they provide. This research investigates the impact of varied build resolution (specifically layer resolution) on the lapping performance of tools built using the stereolithographic assembly (SLA) process in 50 μm and 100 μm layer thicknesses with a methacrylate photopolymer resin on a high resolution desktop printer. As with previous work, the lapping tools were shown to remove workpiece material during the lapping process, but the tools themselves also experienced significant wear on the order of 2-3 times the mass loss of the glass workpieces. The tool wear rates for the 100 μm and 50 μm layer tools were comparable, but the 50 μm layer tool was 74% more effective at removing material from the glass workpiece, which is attributed to some abrasive particles being trapped in the coarser surface of the 100 μm layer tooling and not being available to interact with the glass workpiece. Considering the tool wear, these additively manufactured tools are most appropriate for prototype tooling where the low cost (<$45) and quick turnaround make them attractive when compared to a machined tool.

  18. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  19. A review of diffusion tensor magnetic resonance imaging computational methods and software tools.

    PubMed

    Hasan, Khader M; Walimuni, Indika S; Abid, Humaira; Hahn, Klaus R

    2011-12-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  20. Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology

    PubMed Central

    Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron

    2010-01-01

    Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
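
    The reachability analysis mentioned at the end of the abstract has a simple fixed-point formulation: starting from seed metabolites, repeatedly fire any reaction whose substrates are all reachable and add its products. The toy network below is invented to illustrate the algorithm; it is not Pathway Tools code.

        reactions = [({"glc"}, {"g6p"}), ({"g6p"}, {"f6p"}), ({"f6p", "atp"}, {"fbp"})]
        reachable = {"glc", "atp"}                  # seed compounds

        changed = True
        while changed:
            changed = False
            for substrates, products in reactions:
                if substrates <= reachable and not products <= reachable:
                    reachable |= products           # fire the reaction
                    changed = True

        print(sorted(reachable))   # -> ['atp', 'f6p', 'fbp', 'g6p', 'glc']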

  1. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    PubMed Central

    2012-01-01

    Background: Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results: The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifiable 3D volume rendering display plus matching orthogonal 2D cross-sections from DICOM files. The object can be rotated and axes defined and fixed. Predefined lists of landmarks can be loaded and the landmarks identified within any of the representations. Output files are stored in various established formats, depending on the preferred evaluation software. Conclusions: The software tool presented here provides several options facilitating the placing of landmarks on 3D objects, including volume rendering from DICOM files, definition and fixation of meaningful axes, easy import, placement, control, and export of landmarks, and handling of large datasets. The TINA Manual Landmark Tool runs under Linux and can be obtained for free from http://www.tina-vision.net/tarballs/. PMID:22480150

  2. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
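
    One classic measure such an analysis tool reports is total harmonic distortion (THD). The sketch below drives a memoryless tanh "device" with a sine and reads harmonic levels off an FFT; it is generic practice shown for illustration, not the paper's Matlab toolkit.

        import numpy as np

        fs, f0 = 48000, 1000.0                      # 1 s of audio -> 1 Hz FFT bins
        t = np.arange(fs) / fs
        x = np.tanh(2.0 * np.sin(2 * np.pi * f0 * t))    # nonlinearly distorted sine

        spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        fund = spec[int(f0)]                        # bin at the fundamental
        harm = [spec[int(k * f0)] for k in range(2, 6)]  # 2nd-5th harmonics
        print("THD: %.2f %%" % (100 * np.sqrt(sum(h * h for h in harm)) / fund))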

  3. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC computing systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  4. What parameters to consider and which software tools to use for target selection and molecular design of small interfering RNAs.

    PubMed

    Matveeva, Olga

    2013-01-01

    The design of small gene silencing RNAs with a high probability of being efficient still has some elements of an art, especially when the lowest concentration of small molecules needs to be utilized. The design of highly target-specific small interfering RNAs or short hairpin RNAs is an even greater challenge. Some logical schemes and software tools that can be used for simplifying both tasks are presented here. In addition, sequence motifs and sequence composition biases of small interfering RNAs that have to be avoided because of specificity concerns are also detailed. PMID:23027043

  5. APASVO: A free software tool for automatic P-phase picking and event detection in seismic traces

    NASA Astrophysics Data System (ADS)

    Romero, José Emilio; Titos, Manuel; Bueno, Ángel; Álvarez, Isaac; García, Luz; Torre, Ángel de la; Benítez, M. Carmen

    2016-05-01

    The accurate estimation of the arrival time of seismic waves, or picking, is a problem of major interest in seismic research given its relevance in many seismological applications, such as earthquake source location and active seismic tomography. In recent decades, several automatic picking methods have been proposed with the ultimate goal of implementing picking algorithms whose results are comparable to those obtained by manual picking. In order to facilitate the use of these automated methods in the analysis of seismic traces, this paper presents a new free, open source, graphical software tool, named APASVO, which supports picking tasks in an easy and user-friendly way. The tool also provides event detection functionality, where a relatively imprecise estimation of the onset time is sufficient. The application implements the STA-LTA detection algorithm and the AMPA picking algorithm. An autoregressive AIC-based picking method can also be applied. In addition, this graphical tool is complemented with two command line tools: an event picking tool and a synthetic earthquake generator. APASVO is a multiplatform tool that works on Windows, Linux and OS X. The application can process data in a large variety of file formats. It is implemented in Python and relies on well-known scientific computing packages such as ObsPy, NumPy, SciPy and Matplotlib.
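
    The STA-LTA detection idea that APASVO implements can be sketched in a few lines of NumPy: the ratio of a short-term to a long-term trailing average of signal energy rises sharply at an event onset. This is a generic illustration, not APASVO's own code, and the window lengths and threshold are invented.

      import numpy as np

      def sta_lta(x, nsta, nlta):
          """Classic STA/LTA characteristic function with trailing
          (causal) windows over the signal energy."""
          energy = np.asarray(x, dtype=float) ** 2
          csum = np.concatenate(([0.0], np.cumsum(energy)))
          out = np.zeros(len(energy))
          for i in range(nlta, len(energy)):
              sta = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
              lta = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
              out[i] = sta / max(lta, 1e-12)
          return out

      # Synthetic trace: background noise with a stronger "event"
      rng = np.random.default_rng(0)
      trace = rng.normal(0.0, 1.0, 10000)
      trace[6000:6500] += rng.normal(0.0, 8.0, 500)

      cft = sta_lta(trace, nsta=50, nlta=1000)
      onset = int(np.argmax(cft > 4.0))        # first threshold crossing
      print("trigger at sample:", onset)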

  6. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  7. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  8. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.

    PubMed

    Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A

    2016-09-01

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. PMID:27344044
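
    The 2D Gaussian fitting step can be sketched with SciPy's curve_fit: below, a synthetic noisy peak stands in for a single atom column, and the fit recovers its sub-pixel position. All shapes and parameter values are illustrative assumptions; this is not the published tool's code.

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(coords, amp, x0, y0, sigma, offset):
          """Isotropic 2D Gaussian, returned flattened for curve_fit."""
          x, y = coords
          g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
          return (g + offset).ravel()

      # Synthetic 32x32 patch containing one noisy "atom column"
      yy, xx = np.mgrid[0:32, 0:32]
      rng = np.random.default_rng(1)
      clean = gauss2d((xx, yy), 100.0, 15.3, 16.7, 2.5, 10.0)
      patch = clean + rng.normal(0.0, 2.0, clean.shape)

      p0 = [patch.max(), 16.0, 16.0, 2.0, patch.min()]   # rough initial guess
      popt, _ = curve_fit(gauss2d, (xx, yy), patch, p0=p0)
      print(f"fitted column position: x = {popt[1]:.3f}, y = {popt[2]:.3f}")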

  9. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    SciTech Connect

    Alfonsi, A.; Rabiti, C.; Mandelli, D.; Cogliati, J. J.; Kinoshita, R. A.

    2013-07-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)

  10. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    SciTech Connect

    Alfonsi, Andrea; Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Kinoshita, Robert

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.

  11. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
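
    One simple, generic way to flag outliers of the kind described is a rolling robust z-score against a moving median. The window length and threshold below are illustrative assumptions, and this sketch is not the Data Quality Office's actual algorithm.

      import numpy as np

      def flag_outliers(x, window=100, zmax=5.0):
          """Flag samples more than zmax robust z-scores away from the
          median of a window centred on each sample."""
          x = np.asarray(x, dtype=float)
          flags = np.zeros(len(x), dtype=bool)
          half = window // 2
          for i in range(len(x)):
              seg = x[max(0, i - half): i + half + 1]
              med = np.median(seg)
              mad = np.median(np.abs(seg - med))          # robust spread
              sigma = 1.4826 * max(mad, 1e-12)            # MAD -> std dev
              flags[i] = abs(x[i] - med) > zmax * sigma
          return flags

      rng = np.random.default_rng(2)
      data = rng.normal(20.0, 0.5, 2000)    # a well-behaved datastream
      data[[500, 1500]] = [35.0, -5.0]      # two injected spikes
      print("flagged indices:", np.where(flag_outliers(data))[0])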

  12. The -mdoc macro package: A software tool to support computer documentation standards

    SciTech Connect

    Sanders, C.E.

    1987-09-16

    At Los Alamos National Laboratory a small staff of writers and word processors in the Computer Documentation Group is responsible for producing computer documentation for the over 8000 users of the Laboratory's computer network. The -mdoc macro package was developed as a software tool to support that effort. The -mdoc macro package is used with the NROFF/TROFF document preparation system on the UNIX operating system. The -mdoc macro package incorporates the standards for computer documentation at Los Alamos that were established by the writers. Use of the -mdoc macro package has freed the staff from programming format details, allowing writers to concentrate on the content of documents and word processors to produce documents in a timely manner. It is an easy-to-use software tool that adapts to changing skills, needs, and technology. 5 refs.

  13. Quality assurance of solar spectral UV-measurements: methods and use of the SHICrivm software tool

    NASA Astrophysics Data System (ADS)

    Williams, J. E.; den Outer, P. N.; Slaper, H.

    2003-04-01

    Ground-based UV-irradiance measurements are crucial for determining the long-term changes and trends in biologically and/or photo-chemically relevant solar UV-radiation reaching the Earth's surface. Such changes in UV-radiation levels have probably occurred and/or are expected due to ozone depletion and climate change. In order to analyse UV-irradiation levels in relation to atmospheric parameters and to facilitate an assessment of the European UV-climate, a European database (EUVDatabase) has been set up within the EDUCE-project (EC-contract EVK2-CT-1999-00028). High quality UV-data-sets from across the continent are accessible through the EUVDatabase (http://uv.fmi.fi/uvdb/). An accurate analysis of the UV-climate and long term changes therein requires quality assurance of the spectral data. The SHICrivm software tool (http://www.rivm.nl/shicrivm) is developed to analyse several quality aspects of measured UV-spectra. The SHICrivm tool is applied to over one million spectra from the EUVDatabase and detects for each measured spectrum: the accuracy of the wavelength calibration from 290 up to 500 nm, the lowest detectable irradiance level, the occurrence of non-natural spikes in spectra, deviations in spectral shape, and possible irradiance scale errors in the UV-range. In addition the SHIC-package can be used to correct wavelength scale errors and non-natural spectral spikes. A deconvolution and convolution algorithm is included to improve the comparability of spectra obtained with different instruments, and to allow a fully comparable analysis of biologically weighted UV-doses for instruments with various spectral characteristics. Within the context of the EDUCE-project, data from over 20 UV-monitoring stations are retrieved from the database and a quality assessment is performed using the SHIC-tool. The quality parameters are presented by means of a simple scheme of coloured quality flags. Spectra that meet the WMO-criteria for spectral measurements are

  14. Project I-COP - architecture of software tool for decision support in oncology.

    PubMed

    Blaha, Milan; Janča, Dalibor; Klika, Petr; Mužík, Jan; Dušek, Ladislav

    2013-01-01

    This article briefly describes the development of the I-COP tool, which is designed to promote education and decision making of clinical oncologists. It is based on real data from medical facilities, which are processed, stored in a database, analyzed and finally displayed in an interactive software application. The data sources used are briefly described in individual sections together with the functionality of the developed tools. The final goal of this project is to provide support for work and education within each involved partner center. Clinical oncologists are therefore supposed to be the authors and users at the same time. PMID:23542983

  15. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    SciTech Connect

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
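
    Two statistics commonly used when comparing measured with simulated building energy data are the normalized mean bias error (NMBE) and the coefficient of variation of the RMSE (CV(RMSE)), e.g. as defined in ASHRAE Guideline 14. Whether SEE IT reports exactly these metrics is an assumption; the sketch below simply shows the kind of numerical comparison that complements the plots, with invented hourly values.

      import numpy as np

      def nmbe(measured, simulated):
          """Normalized mean bias error (%): net over/under-prediction."""
          m, s = np.asarray(measured), np.asarray(simulated)
          return 100.0 * np.sum(s - m) / (len(m) * np.mean(m))

      def cv_rmse(measured, simulated):
          """Coefficient of variation of the RMSE (%): scatter about
          the measurements, normalized by their mean."""
          m, s = np.asarray(measured), np.asarray(simulated)
          return 100.0 * np.sqrt(np.mean((s - m) ** 2)) / np.mean(m)

      # One day of hourly electricity use (illustrative values, kWh)
      measured = np.array([12, 11, 11, 10, 10, 12, 18, 25, 30, 32, 33, 33,
                           34, 33, 32, 31, 30, 28, 24, 20, 17, 15, 13, 12],
                          dtype=float)
      simulated = measured * 0.95 + np.linspace(-1.0, 1.0, 24)

      print(f"NMBE    : {nmbe(measured, simulated):+.1f} %")
      print(f"CV(RMSE): {cv_rmse(measured, simulated):.1f} %")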

  16. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.

  17. PlanetPack software tool for exoplanets detection: coming new features

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2014-07-01

    We briefly overview the new features of PlanetPack2, the forthcoming update of PlanetPack, which is a software tool for exoplanets detection and characterization from Doppler radial velocity data. Among other things, this major update brings parallelized computing, new advanced models of the Doppler noise, handling of the so-called Keplerian periodogram, and routines for transits fitting and transit timing variation analysis.
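
    At the heart of Doppler-based detection is fitting a Keplerian radial-velocity model to time-stamped velocity measurements. The sketch below fits only the circular-orbit special case with SciPy (PlanetPack itself handles full Keplerian orbits and advanced noise models); the period, amplitude, and noise level are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def rv_circular(t, P, K, t0, gamma):
          """Radial velocity of a star with one planet on a circular
          orbit: period P (d), semi-amplitude K (m/s), epoch t0,
          systemic velocity gamma (m/s)."""
          return gamma + K * np.sin(2.0 * np.pi * (t - t0) / P)

      rng = np.random.default_rng(3)
      t = np.sort(rng.uniform(0.0, 200.0, 60))          # observation epochs
      rv = rv_circular(t, 14.3, 55.0, 2.0, 10.0)        # true signal
      rv += rng.normal(0.0, 5.0, t.size)                # measurement noise

      popt, pcov = curve_fit(rv_circular, t, rv, p0=[14.0, 50.0, 0.0, 0.0])
      perr = np.sqrt(np.diag(pcov))
      print(f"P = {popt[0]:.2f} +/- {perr[0]:.2f} d, K = {popt[1]:.1f} m/s")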

  18. Tools to aid the specification and design of flight software, appendix B

    NASA Technical Reports Server (NTRS)

    Bristow, G.

    1980-01-01

    The tasks that are normally performed during the specification and architecture design stages of software development are identified. Ways that tools could perform, or aid the performance of, such tasks are also identified. Much of the verification and analysis that is suggested is currently rarely performed during these early stages, but it is believed that this analysis should be done as early as possible so as to detect errors as early as possible.

  19. Techniques and tools for measuring energy efficiency of scientific software applications

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Gonçalo; Ou, Zhonghong; Khan, Kashif

    2015-05-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compared their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads.
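
    On Linux systems with Intel CPUs, one software-based source of such measurements is the RAPL energy counter exposed through the powercap sysfs interface. The sketch below reads the package counter around a stand-in workload; the exact sysfs path varies between machines (and often needs elevated privileges), counter wrap-around is ignored, and the workload is invented.

      import time

      # RAPL package energy in microjoules (Linux powercap interface);
      # the path is machine-dependent and may require root to read.
      RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

      def read_energy_uj():
          with open(RAPL) as f:
              return int(f.read())

      def busy_work(n=5_000_000):
          """Stand-in CPU-bound workload: a pure-Python sum of squares."""
          return sum(i * i for i in range(n))

      e0, t0 = read_energy_uj(), time.time()
      busy_work()
      e1, t1 = read_energy_uj(), time.time()

      joules = (e1 - e0) / 1e6              # wrap-around ignored here
      print(f"energy: {joules:.2f} J, mean power: {joules / (t1 - t0):.2f} W")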

  20. User Driven Development of Software Tools for Open Data Discovery and Exploration

    NASA Astrophysics Data System (ADS)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties of the data sets, such as their quality and resolution. Open data is often catalogued insufficiently or in a fragmented way. Software tools that support effective discovery, including the assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  1. gLAB-A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction of gLAB, a fully software, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules are described. To demonstrate the wide range of uses of gLAB, several practical examples are briefly overviewed: from the analysis of real data from the experimental GIOVE-B satellite, to antenna group delay determination, or C/N0 estimation under a wide dynamic range. gLAB is the result of different projects in GNSS software radio led by Intecs: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract, while the analyzer and the processing module have been developed for ESA to verify and validate the IOV (In Orbit Validation) phase of Galileo. In this case the GNSS software receiver works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLAB is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.

  2. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    PubMed

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-01

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional usability that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists. PMID:26671838

  3. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and

  4. Improvement of a free software tool for the assessment of sediment connectivity

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Lanni, Cristiano; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco

    2015-04-01

    Sediment connectivity expresses the degree of linkage that controls sediment fluxes throughout a landscape, in particular between sediment sources and downstream areas. The assessment of sediment connectivity becomes a key issue when dealing with risk mitigation and priorities of intervention in the territory. In this work, the authors report the improvements made to an open source and stand-alone application (SedInConnect, http://www.sedalp.eu/download/tools.shtml), along with extensive applications to alpine catchments. SedInConnect calculates a sediment connectivity index as expressed in Cavalli et al. (2013); the software improvements consisted primarily in the introduction of the sink feature, i.e. areas that act as traps for sediment produced upstream (e.g., lakes, sediment traps). Based on user-defined sinks, the software decouples those parts of the catchment that do not deliver sediment to a selected target of interest (e.g., fan apex, main drainage network). In this way the assessment of sediment connectivity is achieved by taking into consideration effective sediment contributing areas. Sediment connectivity analysis has been carried out on several catchments in the South Tyrol alpine area (Northern Italy) with the goal of achieving a fast and objective characterization of the topographic control on sediment transfer. In addition to depicting the variability of sediment connectivity inside each basin, the index of connectivity has proved to be a valuable indicator of the dominant process characterizing the basin sediment dynamics (debris flow, bedload, mixed behavior). The characterization of the dominant process is of great importance for hazard and risk assessment in mountain areas, and for the choice and design of structural and non-structural intervention measures. The recognition of the dominant sediment transport process by the index of connectivity is in agreement with evidence arising from post-event field surveys and with the application of

  5. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., Russell, S., and Sudderth, E., Bulletin of the Seismological Society of America (BSSA), vol. 103, no. 2A, pp. 709-729, April 2013.

  6. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    NASA Astrophysics Data System (ADS)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well known barriers exist to the adoption of an environmental approach in the product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both the quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (customer's needs and requirements fulfilment) and the identification of the key environmental aspects in the product's life cycle. A simplified check list allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  7. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    PubMed

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data. PMID:27583365
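
    The lowest-order SOFI computation is easy to state: the second-order auto-cumulant image is the temporal variance of each pixel's fluctuation about its mean, which grows where independently blinking emitters sit. The NumPy sketch below illustrates just this step on an invented image stack; it is not part of the published simulation tool.

      import numpy as np

      def sofi2(stack):
          """Second-order SOFI auto-cumulant of a (time, y, x) stack:
          the per-pixel temporal variance of the intensity fluctuations."""
          fluct = stack - stack.mean(axis=0)     # zero-mean fluctuations
          return np.mean(fluct ** 2, axis=0)     # 2nd cumulant = variance

      # Toy stack: one blinking "emitter" at (16, 16) on a noisy background
      rng = np.random.default_rng(4)
      stack = rng.normal(100.0, 5.0, size=(500, 32, 32))
      blink = (rng.random(500) > 0.5).astype(float)   # on/off time trace
      stack[:, 16, 16] += 50.0 * blink

      img = sofi2(stack)
      print("brightest SOFI pixel:", np.unravel_index(img.argmax(), img.shape))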

  8. Allele Name Translation Tool and Update NomenCLature: software tools for the automated translation of HLA allele names between successive nomenclatures.

    PubMed

    Mack, S J; Hollenbach, J A

    2010-05-01

    In this brief communication, we describe the Allele Name Translation Tool (antt) and Update NomenCLature (uncl), free programs developed to facilitate the translation of human leukocyte antigen (HLA) allele names recorded using the December 2002 version of the HLA allele nomenclature (e.g. A*01010101) to those recorded using the colon-delimited version of the HLA allele nomenclature (e.g. A*01:01:01:01) that was adopted in April 2010. In addition, the antt and uncl translate specific HLA allele-name changes (e.g. DPB1*0502 is translated to DPB1*104:01), as well as changes to the locus prefix for HLA-C (i.e. Cw* is translated to C*). The antt and uncl will also translate allele names that have been truncated to two, four, or six digits, as well as ambiguous allele strings. The antt is a locally installed and run application, while uncl is a web-based tool that requires only an Internet connection and a modern browser. The antt accepts a variety of HLA data-presentation and allele-name formats. In addition, the antt can translate using user-defined conversion settings (e.g. the names of alleles that encode identical peptide binding domains can be translated to a common 'P-code'), and can serve as a preliminary data-sanity tool. The antt is available for download, and uncl for use, at www.igdawg.org/software. PMID:20412076
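
    For alleles that were not renamed, the 2002-to-2010 translation is mechanical: keep the locus prefix and insert colons between successive two-digit fields. The sketch below covers only that regular case and none of the specific renamings (e.g. DPB1*0502 to DPB1*104:01) or the Cw*-to-C* prefix change that the antt and uncl also handle.

      def to_colon_format(name: str) -> str:
          """Translate e.g. 'A*01010101' to 'A*01:01:01:01'.
          Only the mechanical field splitting; renamed alleles,
          expression suffixes and ambiguity strings are not handled."""
          prefix, star, digits = name.partition("*")
          if star != "*" or not digits.isdigit() or len(digits) % 2:
              raise ValueError(f"unexpected allele name: {name!r}")
          fields = [digits[i:i + 2] for i in range(0, len(digits), 2)]
          return f"{prefix}*{':'.join(fields)}"

      for old in ["A*01010101", "B*1501", "DRB1*070101"]:
          print(old, "->", to_colon_format(old))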

  9. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  10. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, I. and Pinder, G. F., "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L. M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  11. A user friendly software tool for the simulation and optimization of high power fiber lasers

    NASA Astrophysics Data System (ADS)

    Shang, Liang; Mao, Qinghe

    2008-12-01

    Double-clad rare-earth-doped fiber lasers are a new generation of high power solid-state lasers. Numerical simulation is an important approach to configuration design and parameter optimization for high power fiber lasers (HPFLs). In this paper, we report our user-friendly high-power fiber laser simulation software system, which integrates design, analysis and optimization functions. The numerical simulations in the software are based on an HPFL model built on rate-equation theory. Using the theoretical model, for a specific laser cavity configuration, doped fiber parameters and pump conditions, the distributions of the population inversion, forward and backward pump, and circulating lasing intensity along the doped fiber can be calculated; thus, the main output characteristics, such as cavity gain, output power and laser efficiency, can be obtained accordingly. On the basis of the simulation results, the software supplies functions for the design and optimization of the pump configuration, the doped fiber length and the reflectivity of the output mirror. By combining the calculated mode-field distribution in doped fibers with the mechanism of curvature loss to suppress higher-order modes, the software also supplies a function for optimizing the beam quality. Through a graphical user interface (GUI), all functions of the software are provided as menu options; for those used frequently, toolbar buttons, shortcut keys and pop-up menus are also provided. The software has a single-document interface (SDI) and is coded in C++ in the integrated development environment of Visual C++ 6.0. We believe it will be very helpful for the investigation and development of HPFLs.

  12. Development of a software tool and criteria evaluation for efficient design of small interfering RNA.

    PubMed

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-01

    RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, and immunostimulatory motifs, were emphasized and implemented in the siRNA design tool. The tool provides a thermodynamic stability score, GC content and a total score based on the other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic scores and positional properties. Comparable thermodynamic scores and better total scores were observed relative to existing tools. Moreover, the results generated had a comparable off-target silencing effect. Evaluation of the criteria, together with additional criteria, was carried out in WEKA. PMID:21145307
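
    Two of the simplest checks in such a pipeline can be written down directly: the GC content of the 19-nt duplex (Reynolds's rules favour roughly 30-52%) and a crude 5'/3'-end asymmetry score as a proxy for thermodynamic stability. The scoring and the example sequence below are illustrative assumptions, not the published tool's exact criteria.

      def gc_content(seq: str) -> float:
          """Fraction of G/C bases in an RNA sequence."""
          seq = seq.upper()
          return (seq.count("G") + seq.count("C")) / len(seq)

      def end_asymmetry(seq: str, n: int = 4) -> int:
          """Crude duplex-end asymmetry: AU count in the last n bases
          minus AU count in the first n (sense strand, 5'->3').
          Positive values suggest unwinding from the desired end."""
          au = lambda s: sum(b in "AU" for b in s.upper())
          return au(seq[-n:]) - au(seq[:n])

      candidate = "GCAGAACGACUUCUUCAAG"        # 19-nt sense strand (invented)
      gc = gc_content(candidate)
      verdict = "ok" if 0.30 <= gc <= 0.52 else "reject"
      print(f"GC content: {gc:.0%} ({verdict})")
      print(f"end asymmetry score: {end_asymmetry(candidate):+d}")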

  13. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-01-01

    Digital Tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, in contrast to the conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm

  14. Strategy Instruction in Early Childhood Math Software: Detecting and Teaching Single-Digit Addition Strategies

    ERIC Educational Resources Information Center

    Carpenter, Kara Kilmartin

    2013-01-01

    In early childhood mathematics, strategy-use is an important indicator of children's conceptual understanding and is a strong predictor of later math performance. Strategy instruction is common in many national curricula, yet is virtually absent from most math software. The current study describes the design of one software activity teaching…

  15. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    PubMed Central

    Teuber, Markus; Wenz, Michael H; Schreiber, Stefan; Franke, Andre

    2009-01-01

    Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform. PMID:19267942

  16. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    PubMed Central

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies is accompanied by the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the adaptivity of each program, and above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering the computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires a complex configuration file. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for the improvement of existing assemblers or the development of novel assemblers. PMID:21423806

  17. NVLabCAP: an NVESD-developed software tool to determine EO system performance

    NASA Astrophysics Data System (ADS)

    Burks, Stephen D.; Doe, Joshua M.; Haefner, David P.; Teaney, Brian P.

    2014-05-01

    Engineers at the US Army Night Vision and Electronic Sensors Directorate have recently developed a software package called NVLabCap. This software not only captures sequential frames from thermal and visible sensors, but also performs measurements of the signal intensity transfer function, 3-dimensional noise, field of view, super-resolved modulation transfer function, and image boresight. Additionally, this software package, along with a set of commonly known inputs for a given thermal imaging sensor, can be used to automatically create an NV-IPM element for that measured system. This model data can be used to determine if a sensor under test is within certain tolerances, and this model can be used to objectively quantify measured versus given system performance.
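
    A stripped-down flavour of the 3-dimensional noise measurement is easy to sketch: from a (time, row, column) frame cube, the time-averaged image isolates fixed spatial pattern while the residual about it gives temporal noise. The full NVESD 3D noise model separates seven directional components; the two-component simplification and the synthetic data below are illustrative only.

      import numpy as np

      def simple_3d_noise(cube):
          """Two-component simplification of 3D noise for a (time, v, h)
          frame cube: fixed-pattern (spatial) and temporal noise."""
          mean_frame = cube.mean(axis=0)               # time average
          sigma_spatial = mean_frame.std()             # fixed pattern
          sigma_temporal = (cube - mean_frame).std()   # frame-to-frame
          return sigma_temporal, sigma_spatial

      rng = np.random.default_rng(5)
      pattern = rng.normal(0.0, 0.5, (128, 160))       # static nonuniformity
      frames = rng.normal(0.0, 1.0, (200, 128, 160)) + pattern

      sig_t, sig_s = simple_3d_noise(frames)
      print(f"temporal noise ~ {sig_t:.2f}, spatial noise ~ {sig_s:.2f}")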

  18. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data have the potential to address a number of research areas, including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The development of the mid-water tools on this project addressed this problem by using a unified way of storing the water column data in a generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water column packets with time-based navigation and attitude, such that downstream in the workflow the tools will have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize the throughput.
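
    The per-file indexing described here can be pictured as one sequential pass that records, for every packet, its byte offset, type, and timestamp, after which any ping can be reached with a single seek. The binary record layout in this sketch is entirely hypothetical and far simpler than any real sonar format.

      import struct

      # Hypothetical record layout: type (uint8), timestamp (float64),
      # payload length (uint32), then the payload bytes themselves.
      HEADER = struct.Struct("<BdI")

      def build_index(path):
          """One pass over the file, noting (offset, type, time) per packet."""
          index = []
          with open(path, "rb") as f:
              while True:
                  offset = f.tell()
                  header = f.read(HEADER.size)
                  if len(header) < HEADER.size:
                      break
                  ptype, tstamp, length = HEADER.unpack(header)
                  index.append((offset, ptype, tstamp))
                  f.seek(length, 1)        # skip payload without reading it
          return index

      # Write a tiny demo file with three packets, then index it
      packets = [(1, 0.0, b"nav"), (2, 0.1, b"watercolumn"), (1, 0.2, b"nav")]
      with open("demo.gwc", "wb") as f:
          for ptype, tstamp, payload in packets:
              f.write(HEADER.pack(ptype, tstamp, len(payload)) + payload)

      for offset, ptype, tstamp in build_index("demo.gwc"):
          print(f"offset={offset:3d}  type={ptype}  t={tstamp}")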

  19. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In section 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this

  20. Development and applications of a software tool for diarthrodial joint analysis.

    PubMed

    Martelli, Sandra; Lopomo, Nicola; Greggio, Samuele; Ferretti, Emil; Visani, Andrea

    2006-07-01

    This paper describes a new software environment for advanced analysis of diarthrodial joints. The new tool provides a number of elaboration functions to investigate joint kinematics, bone anatomy, and ligament and tendon properties. In particular, the shapes and the contact points of the articulating surfaces can be displayed and analysed through 2D user-defined sections and fittings (lines or conics). Ligament behaviour can be evaluated during joint movement, through the computation of elongations, orientations, and fiber strain. Motion trajectories can also be analysed through the calculation of helical axes, instantaneous rotations, and displacements in specific user-chosen coordinate reference frames. The software has a user-friendly graphical interface to display four-dimensional data (time-space data) obtained from medical images, navigation systems, spatial linkages or digitalizers, and can also generate printable reports and multiple graphs as well as ASCII files that can be imported into spreadsheet programs such as Microsoft Excel. PMID:16777259

  1. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org). PMID:25426642

  2. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    PubMed

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals whose urinary tracts are normal from both a structural and a functional perspective. Suggesting appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention, and reducing the unsafe use and overall consumption of antibiotics is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines and treatment suggestions and for reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was evaluated on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results show that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Owing to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems. PMID:22001398
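
    The inference step of an FCM can be sketched compactly: concept activations are iterated through a weighted causal matrix and a squashing function until they settle. The concepts and weights below are hypothetical illustrations, not the FCM-uUTI DSS model.

      # Minimal fuzzy cognitive map (FCM) inference sketch. The concepts and
      # weight matrix are hypothetical, not the FCM-uUTI DSS model.
      import numpy as np

      concepts = ["dysuria", "bacteriuria", "prescribe_antibiotic_A"]

      # W[i, j] is the causal influence of concept i on concept j, in [-1, 1].
      W = np.array([
          [0.0, 0.4, 0.5],
          [0.3, 0.0, 0.7],
          [0.0, -0.6, 0.0],
      ])

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      def fcm_infer(a0, W, steps=20, tol=1e-4):
          """Iterate a(t+1) = f(a(t) + a(t) @ W) until convergence."""
          a = np.asarray(a0, dtype=float)
          for _ in range(steps):
              a_next = sigmoid(a + a @ W)
              if np.max(np.abs(a_next - a)) < tol:
                  return a_next
              a = a_next
          return a

      state = fcm_infer([0.8, 0.6, 0.0], W)
      for name, value in zip(concepts, state):
          print(f"{name}: {value:.2f}")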

  3. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics and new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software that runs in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that, when checked, will improve the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and use by other processes. Web-based applications to view the results are also available.
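
    The kind of seasonal limit check described above can be sketched as follows: percentile limits are derived per month from historical data, and new values outside those limits are flagged. The variable names, column layout, and thresholds are hypothetical.

      # Sketch of seasonal limit checking: derive per-month percentile limits
      # from historical data, then flag new values falling outside them.
      # Column names and quantile thresholds are hypothetical.
      import pandas as pd

      def monthly_limits(history: pd.DataFrame, var="temperature",
                         lo_q=0.001, hi_q=0.999):
          """Return {month: (low, high)} limits from historical measurements."""
          grouped = history.groupby(history["time"].dt.month)[var]
          return {m: (g.quantile(lo_q), g.quantile(hi_q)) for m, g in grouped}

      def flag_outliers(data: pd.DataFrame, limits, var="temperature"):
          """Add a boolean 'suspect' column marking values outside monthly limits."""
          months = data["time"].dt.month
          lows = months.map(lambda m: limits[m][0])
          highs = months.map(lambda m: limits[m][1])
          data = data.copy()
          data["suspect"] = (data[var] < lows) | (data[var] > highs)
          return data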

  4. SPRECware: software tools for Standard PREanalytical Code (SPREC) labeling - effective exchange and search of stored biospecimens.

    PubMed

    Nanni, Umberto; Betsou, Fotini; Riondino, Silvia; Rossetti, Luisa; Spila, Antonella; Valente, Maria Giovanna; Della-Morte, David; Palmirotta, Raffaele; Roselli, Mario; Ferroni, Patrizia; Guadagni, Fiorella

    2012-01-01

    Biobanks provide stored material to basic, translational, and epidemiological research, and this material should be transferred without institute-dependent intrinsic bias. The ISBER Biospecimen Science Working Group has released a "Standard PREanalytical Code" (SPREC), a proposal for a standard coding of the preanalytical options adopted in order to track and make explicit the preanalytical variations in the collection, preparation, and storage of specimens. In this paper we address 2 issues arising in any biobank or biolaboratory aiming to adopt SPREC: (i) reducing the burden required to adopt this standard coding, and (ii) maximizing the immediate benefits of this adoption by providing a free, dedicated software tool. We propose SPRECware, a vision encompassing tools and solutions for the best exploitation of SPREC based on information technology (www.sprecware.org). As a first step, we make available SPRECbase, a software tool useful for generating, storing, managing, and exchanging SPREC-related information associated with specimens. Adopting SPREC is useful both for internal purposes (such as finding the samples having some given preanalytical features) and for exchanging the preanalytical information associated with biological samples between Laboratory Information Systems. If this coding were commonly adopted, it would be easy to find out whether and where, among the participating Biological Resource Centers, the specimens for a given study are available in order to carry out a planned experiment. PMID:23032579
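
    As an illustration, a SPREC-style code can be assembled as an ordered, hyphen-separated series of short element codes. The sketch below uses the seven fluid-sample elements as a guide, but the field names and example values are illustrative only; the official SPREC tables should be consulted for authoritative codes.

      # Illustrative assembly of a SPREC-style code: an ordered, hyphen-separated
      # series of short element codes describing preanalytical handling. Field
      # names and values are illustrative, not the official SPREC tables.
      FLUID_ELEMENTS = [
          "sample_type",
          "primary_container",
          "precentrifugation_delay",
          "first_centrifugation",
          "second_centrifugation",
          "postcentrifugation_delay",
          "storage_condition",
      ]

      def sprec_code(record: dict) -> str:
          """Join the seven element codes in order; 'X' marks an unknown element."""
          return "-".join(record.get(e, "X") for e in FLUID_ELEMENTS)

      example = {
          "sample_type": "SER",           # serum (illustrative code)
          "primary_container": "SST",
          "precentrifugation_delay": "B",
          "first_centrifugation": "D",
          "second_centrifugation": "N",   # no second centrifugation
          "postcentrifugation_delay": "A",
          "storage_condition": "B",
      }
      print(sprec_code(example))  # e.g. SER-SST-B-D-N-A-B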

  5. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
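
    The connector idea can be sketched schematically: a connector adapts records from a data source to the schema an analysis tool expects, applying declarative transformation rules tied to shared (ontology) terms. All class and field names below are hypothetical.

      # Schematic software connector: adapts records from a data source to the
      # schema an analysis tool expects, applying declarative transformation
      # rules. All names are hypothetical illustrations of the connector idea.
      import math
      from typing import Callable, Dict, List

      Record = Dict[str, object]

      class Connector:
          def __init__(self, field_map: Dict[str, str],
                       transforms: Dict[str, Callable[[object], object]]):
              self.field_map = field_map    # source field -> shared ontology term
              self.transforms = transforms  # ontology term -> value transformation

          def convey(self, records: List[Record]) -> List[Record]:
              out = []
              for rec in records:
                  mapped = {}
                  for src, dst in self.field_map.items():
                      value = rec[src]
                      if dst in self.transforms:
                          value = self.transforms[dst](value)
                      mapped[dst] = value
                  out.append(mapped)
              return out

      # Map a source's 'expr' ratio to the log2 scale an analysis tool expects.
      connector = Connector(
          field_map={"gene": "gene_symbol", "expr": "log2_ratio"},
          transforms={"log2_ratio": lambda v: math.log2(v)},
      )
      print(connector.convey([{"gene": "TP53", "expr": 2.0}]))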

  6. A software tool for removing patient identifying information from clinical documents.

    PubMed

    Friedlin, F Jeff; McDonald, Clement J

    2008-01-01

    We created a software tool that accurately removes all patient identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed 2 evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluation, we performed a second evaluation using 7,190 pathology report HL7 messages. We compared the results of the MeDS de-identification process to a gold standard of human review to find identifying strings. For both evaluations, we calculated the number of successful scrubs, missed identifiers, and over-scrubs committed by MeDS and evaluated the readability and interpretability of the scrubbed messages. We categorized all missed identifiers into 3 groups: (1) complete HIPAA-specified identifiers, (2) HIPAA-specified identifier fragments, and (3) non-HIPAA-specified identifiers (such as provider names and addresses). In the first-pass evaluation, MeDS scrubbed 11,273 (99.06%) of the 11,380 HIPAA-specified identifiers and 38,095 (98.26%) of the 38,768 non-HIPAA-specified identifiers. In our second evaluation (after modification of the software), MeDS scrubbed 79,993 (99.47%) of the 80,418 HIPAA-specified identifiers and 12,689 (96.93%) of the 13,091 non-HIPAA-specified identifiers. Approximately 95% of scrubbed messages were both readable and interpretable. We conclude that MeDS successfully de-identified a wide range of medical documents from numerous sources and creates scrubbed reports that retain their interpretability, thereby maintaining their usefulness for research. PMID:18579831
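
    A toy sketch of the combined pattern- and dictionary-based scrubbing that a system like MeDS performs is shown below; the regular expressions, placeholder tokens, and example message are illustrative only, not MeDS's actual rules.

      # Toy de-identification pass combining pattern-based and dictionary-based
      # scrubbing. The regexes and placeholder tokens are illustrative only.
      import re

      PATTERNS = [
          (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
          (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
          (re.compile(r"\b\d{10}\b"), "[MRN]"),
      ]

      def scrub(text: str, known_names: set) -> str:
          """Replace identifier patterns and known patient names with placeholders."""
          for pattern, token in PATTERNS:
              text = pattern.sub(token, text)
          for name in known_names:
              text = re.sub(rf"\b{re.escape(name)}\b", "[NAME]", text,
                            flags=re.IGNORECASE)
          return text

      msg = "Pt John Smith, MRN 0123456789, seen 3/14/2007. SSN 123-45-6789."
      print(scrub(msg, {"John Smith"}))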

  7. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned, and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP. PMID:21330730
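
    An aeration control loop of the kind designed with DSC can be sketched as a discrete PI controller tracking a dissolved-oxygen setpoint. The gains, limits, and sampling time below are hypothetical, not values from the paper.

      # Minimal discrete PI controller for a dissolved-oxygen (DO) setpoint,
      # the kind of aeration loop designed and tuned by simulation. Gains,
      # limits, and variable names are hypothetical.
      class PIController:
          def __init__(self, kp, ki, dt, u_min=0.0, u_max=100.0):
              self.kp, self.ki, self.dt = kp, ki, dt
              self.u_min, self.u_max = u_min, u_max   # airflow valve limits, %
              self.integral = 0.0

          def step(self, setpoint, measurement):
              error = setpoint - measurement
              self.integral += error * self.dt
              u = self.kp * error + self.ki * self.integral
              # Clamp and apply simple anti-windup by freezing the integral.
              if u > self.u_max or u < self.u_min:
                  self.integral -= error * self.dt
                  u = min(max(u, self.u_min), self.u_max)
              return u

      controller = PIController(kp=25.0, ki=2.0, dt=60.0)  # one-minute sampling
      airflow = controller.step(setpoint=2.0, measurement=1.4)  # DO in mg/L
      print(f"airflow command: {airflow:.1f} %")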

  8. A software tool for STED-AFM correlative super-resolution microscopy

    NASA Astrophysics Data System (ADS)

    Koho, Sami; Deguchi, Takahiro; Löhmus, Madis; Näreoja, Tuomas; Hänninen, Pekka E.

    2015-03-01

    Multi-modal correlative microscopy allows combining the strengths of several imaging techniques to provide unique contrast. However, it is not always straightforward to set up instruments for such customized experiments, as most microscope manufacturers use their own proprietary software, with limited or no capability to interface with other instruments - this makes correlation of the multi-modal data extremely challenging. We introduce a new software tool for simultaneous use of a STimulated Emission Depletion (STED) microscope with an Atomic Force Microscope (AFM). In our experiments, a Leica TCS STED commercial super-resolution microscope was used together with an Agilent 5500ilm AFM microscope. With our software, it is possible to synchronize the data acquisition between the STED and AFM instruments, as well as to perform automatic registration of the AFM images with the super-resolution STED images. The software was realized in LabVIEW; the registration part was also implemented as an ImageJ script. The synchronization was realized by controlling simple trigger signals, also available in the commercial STED microscope, with a low-cost National Instruments USB-6501 digital I/O card. The registration was based on detecting the positions of the AFM tip inside the STED field of view, which were then used as registration landmarks. The registration should work on any STED and tip-scanning AFM microscope combination, with nanometer-scale precision. Our STED-AFM correlation method has been tested with a variety of nanoparticle and fixed cell samples. The software will be released under the BSD open-source license.
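
    Landmark-based registration of this kind reduces to a least-squares affine fit between matched point sets. The following numpy sketch illustrates the idea; the landmark coordinates are hypothetical.

      # Least-squares affine registration from matched landmarks, the kind of
      # mapping used to align AFM images to STED images once tip positions
      # have been detected in both. The point data here are hypothetical.
      import numpy as np

      def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
          """Return a 2x3 affine A such that dst ~= [src, 1] @ A.T (least squares)."""
          n = src.shape[0]
          homog = np.hstack([src, np.ones((n, 1))])        # n x 3
          A, *_ = np.linalg.lstsq(homog, dst, rcond=None)  # 3 x 2
          return A.T                                        # 2 x 3

      def apply_affine(A: np.ndarray, pts: np.ndarray) -> np.ndarray:
          homog = np.hstack([pts, np.ones((pts.shape[0], 1))])
          return homog @ A.T

      afm_tips = np.array([[10.0, 12.0], [50.0, 14.0], [30.0, 48.0]])  # AFM px
      sted_pts = np.array([[112.0, 95.0], [352.0, 101.0], [233.0, 305.0]])

      A = fit_affine(afm_tips, sted_pts)
      print(apply_affine(A, afm_tips))  # should reproduce sted_pts closely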

  9. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use in ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low-temperature tooling applications.

  10. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    ERIC Educational Resources Information Center

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  11. Verification of visual odometry algorithms with an OpenGL-based software tool

    NASA Astrophysics Data System (ADS)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multicamera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and it outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.
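
    For a rectified stereo pair, the ground-truth disparity that such a generator emits follows directly from scene depth as d = f·B/Z. A minimal sketch, with hypothetical camera parameters:

      # Ground-truth disparity for a rectified stereo rig: d = f * B / Z, where
      # f is focal length in pixels, B the baseline, and Z the scene depth.
      # The parameter values below are hypothetical.
      import numpy as np

      def disparity_map(depth: np.ndarray, focal_px: float, baseline_m: float):
          """Convert a depth map (meters) to disparities (pixels)."""
          with np.errstate(divide="ignore"):
              return np.where(depth > 0, focal_px * baseline_m / depth, 0.0)

      depth = np.full((480, 640), 5.0)        # flat wall 5 m away
      disp = disparity_map(depth, focal_px=700.0, baseline_m=0.12)
      print(disp[0, 0])                        # 16.8 pixels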

  12. The impact of software and CAE tools on SEU in field programmable gate arrays

    SciTech Connect

    Katz, R.; Wang, J.; McCollum, J.; Cronquist, B.

    1999-12-01

    Field programmable gate array (FPGA) devices, heavily used in spacecraft electronics, have grown substantially in size over the past few years, causing designers to work at a higher conceptual level, with computer aided engineering (CAE) tools synthesizing and optimizing the logic from a description. It is shown that the use of commercial-off-the-shelf (COTS) CAE tools can produce unreliable circuit designs when the device is used in a radiation environment and a flip-flop is upset. At a lower level, software can be used to improve the SEU performance of a flip-flop, exploiting the configurable nature of FPGA technology and on-chip delay, parasitic resistive, and capacitive circuit elements.

  13. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    This paper presents browser-based, multimedia-rich software tools and e-learning curricula that support the design and modeling of power electronics circuits and explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European, and Australian institutes, focuses especially on developing e-learning curricula and interactive design and modeling tools, as well as on the development of a virtual laboratory. Snapshots from these two projects are presented.

  14. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    SciTech Connect

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-07

    Research highlights: The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. Off-target silencing scores below 30 were observed for the best siRNA constructs for different genes. Immunostimulation and cytotoxicity motifs are considered and penalized in the developed tool. Both positional and compositional criteria were observed to be important. -- Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, and immunostimulatory motifs, were emphasized and implemented in the siRNA design tool. The tool provides a thermodynamic stability score, GC content, and a total score based on other design criteria in its output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic scores and positional properties. Comparable thermodynamic scores and better total scores were observed relative to existing tools, and the results generated had comparable off-target silencing effects. Criteria evaluations with additional criteria were performed in WEKA.
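
    A toy sketch of rule-based siRNA scoring in the spirit of Reynolds-type criteria is shown below: a GC-content window and positional base preferences each contribute to a total score. The specific rules and weights are illustrative, not the tool's actual criteria.

      # Toy rule-based siRNA scoring in the spirit of Reynolds-type design
      # rules: GC content in a preferred window plus positional base
      # preferences add to a total score. Rules and weights are illustrative.
      def score_sirna(sense: str) -> int:
          """Score a 19-nt sense-strand sequence; higher is better."""
          assert len(sense) == 19, "expects a 19-nt sense strand"
          s = sense.upper()
          score = 0
          gc = (s.count("G") + s.count("C")) / 19
          if 0.30 <= gc <= 0.52:             # moderate GC content preferred
              score += 2
          if s[18] in "AU":                  # A/U at sense position 19
              score += 1
          if s[2] == "A":                    # A at position 3
              score += 1
          if s[9] == "U":                    # U at position 10
              score += 1
          if s[18] != "G" and s[12] != "G":  # penalize G at positions 13 and 19
              score += 1
          return score

      print(score_sirna("GAUCAGUAAUCGAUCCAAU"))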

  15. A software tool for material data analysis and property prediction: CASAC-ANA

    SciTech Connect

    Zhou, J.; Xie, Q.; Feng, J.; Li, S.; Xu, Z.; Chen, L.; Gui, Z.

    1995-12-31

    In this paper, a user-friendly software, CASAC-ANA, for material data analysis and property prediction is presented. In CASAC-ANA, there are seven methods: Nonlinear Mapping (NLM), Principal Component Analysis (PCA), Stepwise Discriminant Analysis (SDA), Discriminant Analysis with Constellation Graph (DACG), Hierarchical Clustering Analysis (HCA), Stepwise Multiple Linear Regression (SMLR), and Artificial Neural Networks (ANN). The software has some noteworthy features: (1) only one input file is needed and multipath output is produced; (2) both quantitative and qualitative data of dependent variables are accepted; and (3) it is easy to link with materials property databases. As a generalized modeling tool, CASAC-ANA can be used to treat material data concerning composition, technological processes, properties, and to predict properties of materials. The validity of the CASAC-ANA software has been tested successfully with three typical case studies concerning structural alloy steels, nickel-base superalloys, and continuously cast copper alloys. These CASAC-ANA methods have been compared and discussed.

  16. TaxI: a software tool for DNA barcoding using distance methods

    PubMed Central

    Steinke, Dirk; Vences, Miguel; Salzburger, Walter; Meyer, Axel

    2005-01-01

    DNA barcoding is a promising approach to the diagnosis of biological diversity in which DNA sequences serve as the primary key for information retrieval. Most existing software for evolutionary analysis of DNA sequences was designed for phylogenetic analyses and, hence, those algorithms do not offer appropriate solutions for the rapid but precise analyses needed for DNA barcoding, and they are also unable to process the often large comparative datasets. We developed a flexible software tool for DNA taxonomy, named TaxI. This program calculates sequence divergences between a query sequence (the taxon to be barcoded) and each sequence of a dataset of reference sequences defined by the user. Because the analysis is based on separate pairwise alignments, this software is also able to work with sequences characterized by multiple insertions and deletions that are difficult to align in large sequence sets (i.e., thousands of sequences) by multiple alignment algorithms because of computational restrictions. Here, we demonstrate the utility of this approach with two datasets of fish larvae and juveniles from Lake Constance and juvenile land snails under different models of sequence evolution. Sets of ribosomal 16S rRNA sequences, characterized by multiple indels, performed as well as or better than cox1 sequence sets in assigning sequences to species, demonstrating the suitability of rRNA genes for DNA barcoding. PMID:16214755
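
    The core query-versus-references scan of a barcoding tool like TaxI can be sketched with an uncorrected p-distance over pre-aligned sequence pairs standing in for TaxI's separate pairwise alignments; the sequences and taxa below are hypothetical.

      # Sketch of the query-vs-reference distance scan at the heart of a DNA
      # barcoding tool: compute a divergence for each reference and report the
      # closest taxa. An uncorrected p-distance over pre-aligned pairs stands
      # in for TaxI's pairwise alignments.
      def p_distance(a: str, b: str) -> float:
          """Fraction of differing sites, ignoring positions with gaps."""
          pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
          diffs = sum(1 for x, y in pairs if x != y)
          return diffs / len(pairs)

      def rank_references(query: str, references: dict, k: int = 3):
          """Return the k reference taxa closest to the query."""
          dists = {taxon: p_distance(query, seq)
                   for taxon, seq in references.items()}
          return sorted(dists.items(), key=lambda kv: kv[1])[:k]

      refs = {
          "Coregonus_sp": "ACGTACGTACGGACGT",
          "Salmo_trutta": "ACGTTCGAACGGATGT",
      }
      print(rank_references("ACGTACGAACGGACGT", refs))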

  17. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis, and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to conduct a controlled experiment comparing formal-methods-based tools to testing on a realistic, industrial-size example, where the emphasis was on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  18. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly the implementation of numerical algorithms and support for code development, execution, and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicate efforts, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand among climate and weather modelers.

  19. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    SciTech Connect

    Schachter, J.; Peng, Q.; Schissel, D.P.

    1999-07-01

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and both are able to plot data from any fusion research site with an MDSplus data server.

  20. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    PubMed Central

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  1. Software tools for quantification of X-ray microtomography at the UGCT

    NASA Astrophysics Data System (ADS)

    Vlassenbroeck, J.; Dierick, M.; Masschaele, B.; Cnudde, V.; Van Hoorebeke, L.; Jacobs, P.

    2007-09-01

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab®. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  2. DSSR: an integrated software tool for dissecting the spatial structure of RNA

    PubMed Central

    Lu, Xiang-Jun; Bussemaker, Harmen J.; Olson, Wilma K.

    2015-01-01

    Insight into the three-dimensional architecture of RNA is essential for understanding its cellular functions. However, even the classic transfer RNA structure contains features that are overlooked by existing bioinformatics tools. Here we present DSSR (Dissecting the Spatial Structure of RNA), an integrated and automated tool for analyzing and annotating RNA tertiary structures. The software identifies canonical and noncanonical base pairs, including those with modified nucleotides, in any tautomeric or protonation state. DSSR detects higher-order coplanar base associations, termed multiplets. It finds arrays of stacked pairs, classifies them by base-pair identity and backbone connectivity, and distinguishes a stem of covalently connected canonical pairs from a helix of stacked pairs of arbitrary type/linkage. DSSR identifies coaxial stacking of multiple stems within a single helix and lists isolated canonical pairs that lie outside of a stem. The program characterizes ‘closed’ loops of various types (hairpin, bulge, internal, and junction loops) and pseudoknots of arbitrary complexity. Notably, DSSR employs isolated pairs and the ends of stems, whether pseudoknotted or not, to define junction loops. This new, inclusive definition provides a novel perspective on the spatial organization of RNA. Tests on all nucleic acid structures in the Protein Data Bank confirm the efficiency and robustness of the software, and applications to representative RNA molecules illustrate its unique features. DSSR and related materials are freely available at http://x3dna.org/. PMID:26184874

  3. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    SciTech Connect

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  4. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    PubMed Central

    2011-01-01

    new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234

  5. Software for Information Storage and Retrieval Tested, Evaluated and Compared: Part VI--Various Additional Programs.

    ERIC Educational Resources Information Center

    Sieverts, Eric G.; And Others

    1993-01-01

    Reports on tests evaluating nine microcomputer software packages designed for information storage and retrieval: BRS-Search, dtSearch, InfoBank, Micro-OPC, Q&A, STN-PFS, Strix, TINman, and ZYindex. Tables and narrative evaluations detail results related to security, hardware, user features, search capability, indexing, input, maintenance of files,…

  6. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    ERIC Educational Resources Information Center

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  8. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694

  9. Establishing a Web-based DICOM teaching file authoring tool using open-source public software.

    PubMed

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-09-01

    Online teaching files are an important source of educational and referential materials in the radiology community. The Digital Imaging and Communications in Medicine (DICOM) file format commonly used in the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is therefore important when client-side DICOM-converting tools are not available. In this paper, we describe our approach to developing a Web-based teaching file authoring tool. Our server is built using the Apache Web server running on the FreeBSD operating system. The dynamic page content is produced by Hypertext Preprocessor (PHP). DICOM images are converted by ImageMagick into Joint Photographic Experts Group (JPEG) format. DICOM attributes are parsed by dicom3tools and stored in a PostgreSQL database. Using free software available from the Internet, we built a Web service that allows radiologists to create their own online teaching file cases with a common Web browser. PMID:15924271
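
    The conversion step of such a pipeline can be sketched in Python with pydicom and Pillow in place of the ImageMagick/dicom3tools chain the paper describes; this illustrates the idea, not the authors' implementation, and the file names are hypothetical.

      # Sketch of the DICOM-to-JPEG conversion step in such a pipeline, using
      # pydicom and Pillow instead of the ImageMagick/dicom3tools chain the
      # paper describes; it illustrates the idea, not the authors' code.
      import numpy as np
      import pydicom
      from PIL import Image

      def dicom_to_jpeg(dcm_path: str, jpg_path: str) -> None:
          ds = pydicom.dcmread(dcm_path)
          pixels = ds.pixel_array.astype(np.float32)
          # Window the raw values into 8-bit for browser display.
          lo, hi = pixels.min(), pixels.max()
          scaled = ((pixels - lo) / max(hi - lo, 1e-6) * 255.0).astype(np.uint8)
          Image.fromarray(scaled).save(jpg_path, "JPEG")
          # Attributes such as modality or study date can feed the database.
          print(ds.get("Modality", "?"), ds.get("StudyDate", "?"))

      dicom_to_jpeg("case001.dcm", "case001.jpg")  # hypothetical file names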

  10. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection, and extraction of highly discriminative texture attributes, combined with various classification, visualization, and segmentation tools. Examples of MaZda application in medical studies are also provided.

  11. A browsing tool for the Internet Logical Library of the HPCC Software Exchange

    NASA Technical Reports Server (NTRS)

    Biro, Ross

    1993-01-01

    As the quantity of information available on the Internet grows, locating a particular piece of information becomes more difficult. One possible solution is for a database of pointers to all available information to be maintained at a central site. Subject classifications for all the information could also be maintained in order to make searching possible. This paper describes one possible method of searching such an index. In particular, a prototype browsing tool has been created using TCL/TK to demonstrate several possible features: rapidly scanning at any rank of the index, narrowing the index to any scope, regular-expression searching, and creation of a list of pointers matching any set of index terms. The prototype browser is an easy-to-use, independent X application designed for use in the Catalog of Repositories of the HPCC (High Performance Computing and Communications) Software Exchange.

  12. Thermonuclear Reaction Rate Libraries and Software Tools for Nuclear Astrophysics Research

    NASA Astrophysics Data System (ADS)

    Smith, Michael S.; Cyburt, Richard; Schatz, Hendrik; Wiescher, Michael; Smith, Karl; Warren, Scott; Ferguson, Ryan; Lingerfelt, Eric; Buckner, Kim; Nesaraja, Caroline D.

    2008-05-01

    Thermonuclear reaction rates are a crucial input for simulating a wide variety of astrophysical environments. A new collaboration has been formed to ensure that astrophysical modelers have access to reaction rates based on the most recent experimental and theoretical nuclear physics information. To reach this goal, a new version of the REACLIB library has been created by the Joint Institute for Nuclear Astrophysics (JINA), now available online at http://www.nscl.msu.edu/~nero/db. A complementary effort is the development of software tools in the Computational Infrastructure for Nuclear Astrophysics, online at nucastrodata.org, to streamline, manage, and access the workflow of the reaction evaluations from their initiation to peer review to incorporation into the library. Details of these new projects will be described.

  13. BioBrick assembly standards and techniques and associated software tools.

    PubMed

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards are described, as well as some of the most used assembly techniques and cloning procedures, together with a presentation of the available software tools that can be used for deciding on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database. PMID:24395353
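
    Compatibility checking of the kind described largely reduces to inspecting a part for internal copies of a standard's restriction sites. The sketch below uses the EcoRI, XbaI, SpeI, and PstI recognition sequences associated with the widely used RFC[10] standard; the helper function and example part are illustrative.

      # Sketch of a BioBrick RFC[10]-style compatibility check: a part is only
      # assemblable with the standard if it carries no internal copies of the
      # standard's restriction sites. The site sequences are the standard
      # EcoRI, XbaI, SpeI, and PstI recognition sequences.
      RFC10_SITES = {
          "EcoRI": "GAATTC",
          "XbaI": "TCTAGA",
          "SpeI": "ACTAGT",
          "PstI": "CTGCAG",
      }

      def rfc10_incompatibilities(insert_seq: str) -> list:
          """Return the names of RFC[10] enzymes that cut inside the insert."""
          seq = insert_seq.upper()
          return [name for name, site in RFC10_SITES.items() if site in seq]

      part = "ATGGCTAGCAAAGGAATTCGTTAA"   # contains an internal EcoRI site
      clashes = rfc10_incompatibilities(part)
      print(clashes or "compatible with RFC[10]")  # -> ['EcoRI']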

  14. A Software Tool for Processing the Displacement Time Series Extracted from Raw Radar Data

    SciTech Connect

    Coppi, Francesco; Paolo Ricci, Pier; Gentile, Carmelo

    2010-05-28

    The application of high-resolution radar waveforms and interferometric principles recently led to the development of a microwave interferometer suitable for simultaneously measuring the (static or dynamic) deflection of several points on a large structure. From the technical standpoint, the sensor is a Stepped Frequency Continuous Wave (SF-CW) coherent radar operating in the Ku frequency band. In the paper, the main procedures adopted to extract the deflection time series from raw radar data and to assess the quality of the data are addressed, and the MATLAB toolbox developed is described. Subsequently, other functions implemented in the software tool (e.g. evaluation of the spectral matrix of the deflection time-histories, identification of natural frequencies, and evaluation of operational mode shapes) are described, and the application to data recorded on full-scale bridges is exemplified.
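
    For a coherent interferometric radar, line-of-sight displacement follows from the interferometric phase as d = (λ/4π)Δφ. A minimal sketch, with an assumed Ku-band carrier frequency and hypothetical phase samples:

      # Convert interferometric phase to line-of-sight displacement for a
      # coherent SF-CW radar: d = (lambda / (4*pi)) * delta_phi. The carrier
      # frequency and the phase samples below are hypothetical.
      import numpy as np

      C = 299_792_458.0            # speed of light, m/s
      f0 = 17.0e9                  # assumed Ku-band carrier, Hz
      wavelength = C / f0

      def displacement(phase_rad: np.ndarray) -> np.ndarray:
          """Line-of-sight displacement (m) from unwrapped interferometric phase."""
          return wavelength / (4.0 * np.pi) * phase_rad

      phi = np.unwrap(np.array([0.0, 0.8, 1.7, 2.9]))   # radians, over time
      print(displacement(phi) * 1e3, "mm")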

  15. GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis

    PubMed Central

    Hulce, D.; Li, X.; Snyder-Leiby, T.; Jonathan Liu, C.S.

    2011-01-01

    The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, is dependent on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries, requiring researchers to amplify replicate samples with 2 or more multiplexes in order to obtain a genotype for 12-15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spreadsheet, manually combine the results for each individual, and then import them into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer between different programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes, each multiplex providing information on 4-6 loci. The kinship analysis using the merged results provided a 10^17 increase in statistical power, with a range of 10^8 when 5 loci were used versus 10^25 when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to generate dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidean distance. In comparison, the dendrogram constructed using the merged results had eight branches at the same Euclidean distance.

  16. RadNotes: a novel software development tool for radiology education.

    PubMed

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use. PMID:9153710

  17. OligoSpawn: a software tool for the design of overgo probes from large unigene datasets

    PubMed Central

    Zheng, Jie; Svensson, Jan T; Madishetty, Kavitha; Close, Timothy J; Jiang, Tao; Lonardi, Stefano

    2006-01-01

    Background Expressed sequence tag (EST) datasets represent perhaps the largest collection of genetic information. ESTs can be exploited in a variety of biological experiments and analyses. Here we are interested in the design of overlapping oligonucleotide (overgo) probes from large unigene (EST-contig) datasets. Results OLIGOSPAWN is a suite of software tools that offers two complementary services, namely (1) the selection of "unique" oligos, each of which appears in one unigene but does not occur (exactly or approximately) in any other, and (2) the selection of "popular" oligos, each of which occurs (exactly or approximately) in as many unigenes as possible. In this paper, we describe the functionalities of OLIGOSPAWN and the computational methods it employs, and we report on experimental results for the overgo probes designed with it. Conclusion The algorithms we designed are highly efficient and capable of processing unigene datasets of sizes on the order of several tens of Mb in a few hours on a regular PC. The software has been used to design overgo probes employed to screen a barley (Hordeum vulgare) BAC library. OLIGOSPAWN is freely available at . PMID:16401345
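
    The "unique oligo" service can be sketched as exact k-mer set arithmetic over the unigene set: keep the k-mers that occur in exactly one unigene. OLIGOSPAWN's actual algorithms also reject approximate occurrences elsewhere; the sketch below is exact-match only, with hypothetical sequences.

      # Sketch of "unique oligo" selection as exact k-mer set arithmetic: keep
      # k-mers that occur in exactly one unigene. The real algorithms also
      # reject approximate matches elsewhere; this sketch is exact-match only.
      from collections import Counter

      def kmers(seq: str, k: int):
          return (seq[i:i + k] for i in range(len(seq) - k + 1))

      def unique_oligos(unigenes: dict, k: int = 8) -> dict:
          """Map each unigene to k-mers found in it and in no other unigene."""
          counts = Counter()
          for seq in unigenes.values():
              for mer in set(kmers(seq, k)):   # count presence per unigene
                  counts[mer] += 1
          return {
              name: [m for m in set(kmers(seq, k)) if counts[m] == 1]
              for name, seq in unigenes.items()
          }

      unigenes = {"u1": "ACGTACGGTCAGT", "u2": "TTACGTACGGAAC"}
      print(unique_oligos(unigenes, k=6))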

  18. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    NASA Astrophysics Data System (ADS)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

    Sounding rockets are the only way to perform in-situ measurements, which are very important experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT). The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket motion (typically about 1000 m/s). This shock wave disturbs the density, temperature, and velocity fields in the vicinity of the rocket relative to the undisturbed values of the atmosphere. This effect, however, can be quantified, and the measured data has to be corrected not just to make it more precise but simply usable. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G.A. Bird, which is available as a stand-alone program limited to a single processor. Apart from the complications in simulating flows around bodies related to the different flow regimes in the altitude range of the MLT, which arise due to the exponential density change over several orders of magnitude, a particular hardware configuration introduces significant difficulty for aerodynamical calculations due to the choice of grid sizes, which depends mainly on the demands of an adequate DSMC and on good resolution of geometries whose scales differ by orders of magnitude. This either makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open-source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around a sounding rocket instrument. An advantage of this software package, among other things, is that it can run on high-performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  19. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  20. Freeform manufacturing of a progressive addition lens by use of a voice coil fast tool servo

    NASA Astrophysics Data System (ADS)

    Li, Yi Yu; Chen, Jiao Jie; Feng, Hai Hua; Li, Chaohong; Qu, Jia; Chen, Hao

    2014-08-01

    The back surface of a progressive addition lens (PAL) is a non-rotationally symmetric freeform surface. The local radius varies progressively from the far zone to the near zone along the intermediate zone to give the addition power. A numerical simulation method is used to calculate the discrete points of the freeform surface in polar coordinates and to generate the data files containing the diamond-tool tip trajectory for surface machining. The fabrication of the PAL is accomplished using a self-developed single-point diamond turning machine with a voice coil fast tool servo. The polished freeform surface profile, measured by a 3-axis coordinate measuring machine, shows little deviation from the simulation result. Surface power and cylinder of the fabricated PAL are also measured for comparison with the theoretical design.

  1. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end, a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), producing user-friendly tools for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers, the Thompson-Gray Model for the response from an internal defect, the Independent Scatterer Model for backscattered grain noise, and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
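
    The quantity such models estimate can be sketched simply as the peak defect-signal amplitude over the RMS of the backscattered grain noise in the same time gate. The synthetic waveforms below stand in for model-computed signals; they are not the Fortran engine's models.

      # Toy signal-to-noise estimate of the kind these models produce: peak
      # flaw signal amplitude divided by RMS grain noise in the same time
      # gate. Synthetic waveforms stand in for model-computed signals.
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 10.0e-6, 2000)                 # 10-microsecond gate
      noise = 0.05 * rng.standard_normal(t.size)          # grain noise (arb.)
      flaw = 0.4 * np.exp(-((t - 5e-6) / 0.2e-6) ** 2)    # defect echo envelope

      def snr(signal: np.ndarray, noise: np.ndarray) -> float:
          """Peak signal over RMS noise."""
          return float(np.max(np.abs(signal)) / np.sqrt(np.mean(noise ** 2)))

      print(f"S/N = {snr(flaw + noise, noise):.1f}")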

  2. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    NASA Technical Reports Server (NTRS)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  3. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    PubMed

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of a new tooth preparation assessment software, Preppr, as an educational tool for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year in a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and total convergence angles (TOC) between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns. PMID:27480712
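
    For readers unfamiliar with the TOC metric, it is the angle between two opposing axial walls, i.e. the sum of each wall's inclination from the tooth's long axis. A minimal check against the stipulated window (function name and values hypothetical):

    ```python
    def total_occlusal_convergence(mesial_deg, distal_deg):
        """TOC: sum of the two opposing walls' inclinations (degrees)."""
        toc = mesial_deg + distal_deg
        return toc, 10.0 <= toc <= 20.0    # value, and whether it meets the window

    print(total_occlusal_convergence(6.0, 8.0))    # (14.0, True)
    ```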

  4. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

    A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response, PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps showing each station's geographical context, with reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real-time and display the time
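
    GTS message formats vary widely, but the decode step reduces to parsing station, timestamp, and level fields and joining them with station metadata. A toy sketch, assuming a made-up one-line format and a tiny stand-in for the COMP_META metadata:

    ```python
    # Hypothetical format "STATION YYYYMMDDHHMM level_mm"; real GTS messages
    # are far more varied, which is exactly why TIDE TOOL needs COMP_META.
    META = {"HILO": {"lat": 19.73, "lon": -155.06}}

    def decode_line(line):
        station, stamp, level_mm = line.split()
        m = META[station]
        return {"station": station, "lat": m["lat"], "lon": m["lon"],
                "time": stamp, "sea_level_m": int(level_mm) / 1000.0}

    print(decode_line("HILO 201212010000 2134"))
    ```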

  5. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    PubMed

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen's Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation. PMID:25742008
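
    The two headline statistics are straightforward to reproduce from set comparisons; a sketch under the stated definitions (false-negative rate among SNAP's "missing" calls, Cohen's kappa between the two binary availability calls; names invented here):

    ```python
    def snap_validation(snap_present, chip_present, all_snps):
        """snap_present / chip_present: sets of SNP ids each source reports
        as available on the array; all_snps: the full set queried."""
        snap_missing = all_snps - snap_present
        false_neg = snap_missing & chip_present        # called missing, actually present
        fn_rate = len(false_neg) / len(snap_missing) if snap_missing else 0.0
        n = len(all_snps)                              # Cohen's kappa on 2x2 agreement
        po = sum((s in snap_present) == (s in chip_present) for s in all_snps) / n
        pa, pb = len(snap_present) / n, len(chip_present) / n
        pe = pa * pb + (1 - pa) * (1 - pb)
        return fn_rate, (po - pe) / (1 - pe)
    ```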

  6. Should We Have Blind Faith in Bioinformatics Software? Illustrations from the SNAP Web-Based Tool

    PubMed Central

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N.; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any ‘false positive’ SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen’s Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation. PMID:25742008

  7. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and presents the results visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott index of agreement (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
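
    Both goodness-of-fit measures quoted are standard and easy to compute; a sketch using the usual definitions (the paper's exact relative-error convention is not stated, so a mean absolute relative error is assumed):

    ```python
    import numpy as np

    def relative_error(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

    def willmott_d(pred, obs):
        """Willmott's index of agreement: 1 - SSE / potential error."""
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        om = obs.mean()
        pot = np.sum((np.abs(pred - om) + np.abs(obs - om)) ** 2)
        return float(1.0 - np.sum((pred - obs) ** 2) / pot)
    ```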

  8. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance EnCase Forensic 7 regarding their performance, functionality, usability and capability. We show how these software tools work with large forensic images and how capable they are of examining complex and big-data scenarios.

  9. Use of slide presentation software as a tool to measure hip arthroplasty wear.

    PubMed

    Yun, Ho Hyun; Jajodia, Nirmal K; Myung, Jae Sung; Oh, Jong Keon; Park, Sang Won; Shon, Won Yong

    2009-12-01

    The authors propose a manual measurement method for wear in total hip arthroplasty (PowerPoint method) based on the well-known Microsoft PowerPoint software (Microsoft Corporation, Redmond, Wash). In addition, the accuracy and reproducibility of the devised method were quantified and compared with two methods previously described by Livermore and Dorr, and accuracies were determined at different degrees of wear. The 57 hips recruited were allocated to: class 1 (retrieval series), class 2 (clinical series), and class 3 (a repeat film analysis series). The PowerPoint method was found to have good reproducibility and to better detect wear differences between classes. The devised method can be easily used for recording wear at follow-up visits and could be used as a supplementary method when computerized methods cannot be employed. PMID:19896061

  10. Development and evaluation of an open source software tool for deidentification of pathology reports

    PubMed Central

    Beckwith, Bruce A; Mahaadevan, Rajeshwarri; Balis, Ulysses J; Kuo, Frank

    2006-01-01

    Background Electronic medical records, including pathology reports, are often used for research purposes. Currently, there are few programs freely available to remove identifiers while leaving the remainder of the pathology report text intact. Our goal was to produce an open-source, Health Insurance Portability and Accountability Act (HIPAA) compliant, deidentification tool tailored for pathology reports. We designed a three-step process for removing potential identifiers. The first step is to look for identifiers known to be associated with the patient, such as name, medical record number, pathology accession number, etc. Next, a series of pattern matches look for predictable patterns likely to represent identifying data, such as dates, accession numbers and addresses, as well as patient, institution and physician names. Finally, individual words are compared with a database of proper names and geographic locations. Pathology reports from three institutions were used to design and test the algorithms. The software was improved iteratively on training sets until it exhibited good performance. 1800 new pathology reports were then processed. Each report was reviewed manually before and after deidentification to catalog all identifiers and note those that were not removed. Results 1254 (69.7%) of 1800 pathology reports contained identifiers in the body of the report. 3439 (98.3%) of 3499 unique identifiers in the test set were removed. Only 19 HIPAA-specified identifiers (mainly consult accession numbers and misspelled names) were missed. Of the 41 non-HIPAA identifiers missed, the majority were partial institutional addresses and ages. Outside consultation case reports typically contain numerous identifiers and were the most challenging to deidentify comprehensively. There was variation in performance among reports from the three institutions, highlighting the need for site-specific customization, which is easily accomplished with our tool. Conclusion We have
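
    The three-step structure maps naturally onto a small pipeline: substitute known patient identifiers, then regex patterns, then dictionary words. A minimal sketch; the patterns and word list here are illustrative stand-ins, not the tool's own rules:

    ```python
    import re

    PATTERNS = [                                        # step 2: predictable formats
        (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
        (re.compile(r"\b[A-Z]{1,3}\d{2}-\d{3,6}\b"), "[ACCESSION]"),
        (re.compile(r"\b\d{5}(?:-\d{4})?\b"), "[ZIP]"),
    ]
    KNOWN_WORDS = {"smith", "boston"}                   # step 3: names/places lookup

    def deidentify(text, patient_ids=()):
        for ident in patient_ids:                       # step 1: known identifiers
            text = text.replace(ident, "[PHI]")
        for rx, tag in PATTERNS:
            text = rx.sub(tag, text)
        return " ".join("[NAME]" if w.lower().strip(".,") in KNOWN_WORDS else w
                        for w in text.split(" "))

    print(deidentify("Seen by Dr. Smith on 3/14/2004, case S04-1234.", ("MRN778899",)))
    ```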

  11. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  12. Students' Learning Experiences When Using a Dynamic Geometry Software Tool in a Geometry Lesson at Secondary School in Ethiopia

    ERIC Educational Resources Information Center

    Denbel, Dejene Girma

    2015-01-01

    Students' learning experiences when using a Dynamic Geometry Software (DGS) tool were investigated in geometry lessons with 25 Ethiopian secondary school students. The research data were drawn from the worksheets used, classroom observations, the results of pre- and post-tests, a questionnaire, and interview responses. I used GeoGebra as a DGS…

  13. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    EPA Science Inventory

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be used effectively and efficiently to help environmentally conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  14. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…

  15. A Quantitative Study of a Software Tool that Supports a Part-Complete Solution Method on Learning Outcomes

    ERIC Educational Resources Information Center

    Garner, Stuart

    2009-01-01

    This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students…

  16. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  17. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    ERIC Educational Resources Information Center

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  18. Prognostic 2.0: software tool for heart rate variability analysis and QT interval dispersion

    NASA Astrophysics Data System (ADS)

    Mendoza, Alfonso; Rueda, Oscar L.; Bautista, Lola X.; Martinez, Víctor E.; Lopez, Eddie R.; Gomez, Mario F.; Alvarez, Alexander

    2007-09-01

    Cardiovascular diseases, in particular Acute Myocardial Infarction (AMI), are the leading cause of death in industrialized countries. Measurements of indicators of the behavior of the autonomic nervous system, such as Heart Rate Variability (HRV) and QT Interval Dispersion (QTD), in the acute phase of the AMI (first 48 hours after the event) give a good estimation of the subsequent cardiac events that a person who has suffered an AMI could present. This paper describes the implementation of the second version of Prognostic-AMI, a software tool that automates the calculation of such indicators. It uses the Discrete Wavelet Transform (DWT) to de-noise the signals and to detect the QRS complex and the T-wave from a conventional 12-lead electrocardiogram. Indicators are measured in both the time and frequency domains. A pilot trial performed on a sample population of 76 patients shows that people who had cardiac complications in the acute phase of the AMI have low values of the HRV and QTD indicators.
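
    A common recipe for the DWT de-noising step (the abstract does not give the exact scheme, so universal soft thresholding with PyWavelets is assumed here):

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(ecg, wavelet="db4", level=4):
        """Soft-threshold the detail coefficients with the universal threshold."""
        coeffs = pywt.wavedec(ecg, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
        thr = sigma * np.sqrt(2.0 * np.log(len(ecg)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(ecg)]
    ```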

  19. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

    In this research a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material-property estimation technique that generates effective material properties at the microscopic level. The material property models were developed from empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. This methodology is therefore foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of the microstructures to their performance characteristics. The process involves learning the relationship between these parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of that property. This technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.
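
    The abstract names radial basis function networks for learning the parameter-property relationship; a minimal regression sketch with SciPy on made-up training data (not the STEAM engine itself):

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # toy training data: (load, film thickness) -> measured deflection
    X = np.array([[1.0, 2.0], [2.0, 2.0], [1.0, 3.0], [2.0, 3.0], [1.5, 2.5]])
    y = np.array([0.10, 0.21, 0.07, 0.15, 0.12])

    model = RBFInterpolator(X, y, kernel="thin_plate_spline")
    print(model(np.array([[1.2, 2.2]])))   # predict deflection at a new point
    ```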

  20. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
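
    A textbook version of the ray-tracing approach, treating each object point as a spherical-wave source, adding a tilted plane reference wave, and binarizing the resulting intensity (all parameters here are illustrative, not the authors' code):

    ```python
    import numpy as np

    def binary_cgh(points, nx=1024, ny=1024, pitch=10e-6, wavelength=633e-9):
        """Binary CGH: superposed spherical waves + plane reference, thresholded."""
        k = 2.0 * np.pi / wavelength
        x = (np.arange(nx) - nx / 2) * pitch
        y = (np.arange(ny) - ny / 2) * pitch
        X, Y = np.meshgrid(x, y)
        field = np.zeros((ny, nx), dtype=complex)
        for px, py, pz in points:                      # spherical wave per point
            r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
            field += np.exp(1j * k * r) / r
        ref = np.exp(1j * k * X * np.sin(np.deg2rad(1.0)))   # tilted reference
        intensity = np.abs(field + ref) ** 2
        return (intensity > np.median(intensity)).astype(np.uint8)

    hologram = binary_cgh([(0.0, 0.0, 0.2), (1e-3, 0.0, 0.2)])
    ```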

  1. PlanetPack3: a software tool for exoplanets characterization from radial velocity and transit data

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2015-08-01

    We describe the forthcoming third major release of the PlanetPack software tool for exoplanet detection and characterization from Doppler and/or transit data. Among other things, this major update will bring routines for the joint fitting of radial velocities and transits, optionally taking into account various subtle effects: the Rossiter-McLaughlin effect, the light arrival time delay between the radial velocity and transit curves, and new experimental models of the Doppler or photometry noise, including non-stationary models with variable noise magnitude (due to, e.g., stellar activity variations). This work was supported by the Russian Foundation for Basic Research (project No. 14-02-92615 KO_a), the UK Royal Society International Exchange grant IE140055, by the President of Russia grant for young scientists (No. MK-733.2014.2), by the programme of the Presidium of Russian Academy of Sciences P21, and by the Saint Petersburg State University research grant 6.37.341.2015.

  2. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    NASA Astrophysics Data System (ADS)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine, Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming with state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Further, it generates a PDF report with a brief description of these tests and the obtained results.

  3. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    PubMed

    DeJesus, Michael A; Ambadipudi, Chaitra; Baker, Richard; Sassetti, Christopher; Ioerger, Thomas R

    2015-10-01

    TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have previously been implicated in growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit. PMID:26447887

  4. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.

  5. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    NASA Astrophysics Data System (ADS)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact and limit the accuracy of impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation in the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
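
    For reference, the single-dispersion Cole model underlying this analysis has a compact closed form; a sketch (the parameter values shown are placeholders, since fitted values would come from measurement):

    ```python
    import numpy as np

    def cole_impedance(freq_hz, r0, rinf, tau, alpha):
        """Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha)."""
        w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
        return rinf + (r0 - rinf) / (1.0 + (1j * w * tau) ** alpha)

    Z = cole_impedance(np.logspace(3, 6, 50), r0=60.0, rinf=20.0,
                       tau=1e-6, alpha=0.85)   # complex impedance, ohms
    ```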

  6. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and to implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the characteristics of robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software. PMID:26846288
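
    One standard way to build such an acceptance region is a Monte Carlo envelope of Gaussian order statistics; GTest's exact construction may differ, so treat this as a sketch of the idea only:

    ```python
    import numpy as np

    def qq_acceptance_region(sample, alpha=0.05, n_sim=2000, seed=0):
        """Pointwise Monte Carlo envelope for a normal Q-Q plot."""
        rng = np.random.default_rng(seed)
        x = np.sort(np.asarray(sample, dtype=float))
        z = (x - x.mean()) / x.std(ddof=1)             # standardized order stats
        sims = np.sort(rng.standard_normal((n_sim, len(x))), axis=1)
        lo = np.quantile(sims, alpha / 2, axis=0)
        hi = np.quantile(sims, 1 - alpha / 2, axis=0)
        return z, lo, hi, bool(np.all((z >= lo) & (z <= hi)))  # fits if all inside
    ```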

  7. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    NASA Astrophysics Data System (ADS)

    Mayorga, E.

    2013-12-01

    Practical, problem-oriented software developed by scientists and graduate students in domains lacking a strong software-development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower-level languages such as C or FORTRAN, and access to web-service infrastructure. Python in particular has seen large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (RiverTools, TauDEM, etc.) and geochemical modules such as CO2SYS, PHREEQC and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such

  8. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    SciTech Connect

    Sterling, T.; Messina, P.; Chen, M.

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  9. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    NASA Astrophysics Data System (ADS)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) that probe the crust of Mars to explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening to the return echoes that occur at dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing-point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA data using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analyses. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web

  10. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    PubMed Central

    Schwarz, Roland; Musch, Patrick; von Kamp, Axel; Engels, Bernd; Schirmer, Heiner; Schuster, Stefan; Dandekar, Thomas

    2005-01-01

    Background A number of algorithms for steady-state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL, a reliable high-performance implementation of the algorithm, has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and for the most central, well-connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download. Conclusion A graphical toolbox and an editor for METATOOL, as well as a series of additional routines for metabolic network analyses, constitute a new user-friendly software package for such efforts. PMID:15929789

  11. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    PubMed Central

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2011-01-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957

  12. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, the current descriptions of the AED V&V tools are contained herein and provide information to augment NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 print file.

  13. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets seamlessly and securely with remote collaborators and other users; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open sourcing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared privately by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  14. Evaluation of Exogenous siRNA Addition as a Metabolic Engineering Tool for Modifying Biopharmaceuticals

    PubMed Central

    Tummala, Seshu; Titus, Michael; Wilson, Lee; Wang, Chunhua; Ciatto, Carlo; Foster, Donald; Szabo, Zoltan; Guttman, Andras; Li, Chen; Bettencourt, Brian; Jayaraman, Muthuswamy; Deroot, Jack; Thill, Greg; Kocisko, David; Pollard, Stuart; Charisse, Klaus; Kuchimanchi, Satya; Hinkle, Greg; Milstein, Stuart; Myers, Rachel; Wu, Shiaw-Lin; Karger, Barry; Rossomando, Anthony

    2012-01-01

    Traditional metabolic engineering approaches, including homologous recombination, zinc finger nucleases, and short hairpin RNA (shRNA), have previously been employed to generate biologics with specific characteristics that improve efficacy, potency, and safety. An alternative approach is to exogenously add soluble small interfering RNA (siRNA) duplexes, formulated with a cationic lipid, directly to cells grown in shake flasks or bioreactors. This approach has the following potential advantages: no cell line development required, the ability to tailor mRNA silencing by adjusting siRNA concentration, simultaneous silencing of multiple target genes, and potential temporal control of down-regulation of target gene expression. In this study, we demonstrate proof of concept of the siRNA feeding approach as a metabolic engineering tool in the context of increasing monoclonal antibody afucosylation. First, potent siRNA duplexes targeting fut8 and gmds were dosed into shake flasks with cells that express an anti-CD20 monoclonal antibody. Dose-response studies demonstrated the ability to titrate the silencing effect. Furthermore, siRNA addition resulted in no deleterious effects on cell growth, final protein titer, or specific productivity. In bioreactors, antibodies produced by cells following siRNA treatment exhibited improved functional characteristics compared to antibodies from untreated cells, including increased levels of afucosylation (63%), a 17-fold improvement in FcγRIIIa binding, and an increase in specific cell lysis of up to 30%, as determined in an ADCC assay. In addition, standard purification procedures effectively cleared the exogenously added siRNA and transfection agent. Moreover, no differences were observed when other key product-quality structural attributes were compared to untreated controls. These results establish that exogenous addition of siRNA represents a potentially novel metabolic engineering tool to improve biopharmaceutical function and

  15. GEnomes Management Application (GEM.app): a new software tool for large-scale collaborative genome analysis.

    PubMed

    Gonzalez, Michael A; Lebrigio, Rafael F Acosta; Van Booven, Derek; Ulloa, Rick H; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schüle, Rebecca; Züchner, Stephan

    2013-06-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges, we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ∼1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for nonbioinformaticians to make next-generation sequencing data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 sec across ∼1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. PMID:23463597

  16. Detecting variants with Metabolic Design, a new software tool to design probes for explorative functional DNA microarray development

    PubMed Central

    2010-01-01

    Background Microorganisms display vast diversity, and each one has its own set of genes, cell components and metabolic reactions. To assess their huge unexploited metabolic potential in different ecosystems, we need high throughput tools, such as functional microarrays, that allow the simultaneous analysis of thousands of genes. However, most classical functional microarrays use specific probes that monitor only known sequences, and so fail to cover the full microbial gene diversity present in complex environments. We have thus developed an algorithm, implemented in the user-friendly program Metabolic Design, to design efficient explorative probes. Results First we have validated our approach by studying eight enzymes involved in the degradation of polycyclic aromatic hydrocarbons from the model strain Sphingomonas paucimobilis sp. EPA505 using a designed microarray of 8,048 probes. As expected, microarray assays identified the targeted set of genes induced during biodegradation kinetics experiments with various pollutants. We have then confirmed the identity of these new genes by sequencing, and corroborated the quantitative discrimination of our microarray by quantitative real-time PCR. Finally, we have assessed metabolic capacities of microbial communities in soil contaminated with aromatic hydrocarbons. Results show that our probe design (sensitivity and explorative quality) can be used to study a complex environment efficiently. Conclusions We successfully use our microarray to detect gene expression encoding enzymes involved in polycyclic aromatic hydrocarbon degradation for the model strain. In addition, DNA microarray experiments performed on soil polluted by organic pollutants without prior sequence assumptions demonstrate high specificity and sensitivity for gene detection. Metabolic Design is thus a powerful, efficient tool that can be used to design explorative probes and monitor metabolic pathways in complex environments, and it may also be used to

  17. A flexible, interactive software tool for fitting the parameters of neuronal models.

    PubMed

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool. PMID

  18. A flexible, interactive software tool for fitting the parameters of neuronal models

    PubMed Central

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I.; Freund, Tamás F.; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool. PMID

  19. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    PubMed

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full-body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion-capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play for 43 children (mean age = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between the human raters and the KART system for the jump (r = 0.84, p < .01), and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both the humans and KART had higher agreement for jumps than sidesteps in the game-play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437
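
    Movement counting from skeletal traces often reduces to peak detection on a joint trajectory; a sketch for jumps (thresholds and names invented here, not KART's):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def count_jumps(pelvis_y, fps=30, min_rise=0.05):
        """Count jumps in a Kinect pelvis-height trace (meters): peaks at
        least min_rise above the median baseline, at least 0.4 s apart."""
        y = np.asarray(pelvis_y, dtype=float)
        peaks, _ = find_peaks(y, height=np.median(y) + min_rise,
                              distance=int(0.4 * fps))
        return len(peaks)
    ```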

  20. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming

    PubMed Central

    Rosenberg, Michael; Lay, Brendan S.; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play, for 43 children (mean age 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between human raters and the KART system for the jump (r = 0.84, p < .01) and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results. PMID:27442437

  1. Biomedical Mutation Analysis (BMA): A software tool for analyzing mutations associated with antiviral resistance

    PubMed Central

    Salvatierra, Karina; Florez, Hector

    2016-01-01

    Introduction: Hepatitis C virus (HCV) is considered a major public health problem, with 200 million people infected worldwide. Treatment of chronic HCV infection with pegylated interferon alpha plus ribavirin is nonspecific; consequently, it is effective in only 50% of infected patients. This has prompted the development of direct-acting antivirals (DAA) that target virus proteins. These DAA have demonstrated a potent effect in vitro and in vivo; however, virus mutations associated with the development of resistance have been described. Objective: To design and develop an online information system for detecting mutations in amino acids known to be implicated in resistance to DAA. Materials and methods: We used computer applications, technological tools, standard languages, infrastructure systems, and algorithms to analyze positions associated with resistance to DAA in the NS3, NS5A, and NS5B genes of HCV. Results: We designed and developed an online information system named Biomedical Mutation Analysis (BMA), which allows users to calculate changes in nucleotide and amino acid sequences for each selected sequence from conventional Sanger and cloning sequencing using a graphical interface. Conclusion: BMA quickly, easily and effectively analyzes mutations, and includes complete documentation and examples. Furthermore, its different visualization techniques allow proper interpretation and understanding of the results. The data obtained using BMA will be useful for the assessment and surveillance of HCV resistance to new antivirals, and for designing treatment regimens that select those DAA to which the virus is not resistant, avoiding unnecessary treatment failures. The software is available at: http://bma.itiud.org. PMID:27547378
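
    As a hedged sketch of the core check such a system performs, the function below translates a query gene sequence and flags amino-acid changes at resistance-associated positions. The position table is a placeholder for a curated table like BMA's, the helper name is hypothetical, and Biopython is assumed for translation:

        from Bio.Seq import Seq

        # Placeholder wild-type residues at resistance-associated positions (illustrative only).
        RESISTANCE_SITES = {155: "R", 156: "A", 168: "D"}

        def find_resistance_mutations(dna):
            protein = str(Seq(dna).translate())
            hits = []
            for pos, wild_type in RESISTANCE_SITES.items():
                if pos <= len(protein) and protein[pos - 1] != wild_type:
                    hits.append(f"{wild_type}{pos}{protein[pos - 1]}")  # e.g. "R155K"
            return hits

    A real tool would additionally align the query to a reference frame before translation, which is omitted here.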

  2. Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin

    2011-01-01

    The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.

  3. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    NASA Technical Reports Server (NTRS)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and involve trial-and-error procedures that may require activating the system solely to find the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) as a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) as a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.
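
    As a toy illustration of off-line tuning with such a model, the sketch below evaluates candidate PI controller gains against a first-order pressure plant before any hardware is activated. The dynamics, gains, and units are invented for illustration and are not the project's models:

        import numpy as np

        dt, t_end = 0.01, 20.0                 # time step and horizon, s
        kp, ki = 2.0, 0.5                      # candidate controller gains to evaluate
        tau, gain = 3.0, 1.5                   # assumed plant time constant (s) and gain
        setpoint, p, integral = 5.0, 0.0, 0.0  # target and current pressure, MPa

        for _ in range(int(t_end / dt)):
            error = setpoint - p
            integral += error * dt
            valve = np.clip(kp * error + ki * integral, 0.0, 10.0)  # bounded valve command
            p += dt * (gain * valve - p) / tau                      # first-order plant update

        print(f"pressure after {t_end:.0f} s: {p:.3f} MPa (setpoint {setpoint} MPa)")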

  4. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, which we call bone conduction (BC) sound. Several investigations have examined the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to generate a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions, and to solve the FE equations. The tool uses the PAK solver, open-source software implemented in the SIFEM FP7 project, to calculate the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to assess how well the obtained results match experimental measurements.
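
    The first pipeline step (segmentation of a CT volume to an STL surface) can be sketched as below, assuming scikit-image and trimesh and a CT volume already imported from DICOM. The threshold value and file names are placeholders, and the later tetrahedral-meshing and PAK solution steps are not shown:

        import numpy as np
        from skimage import measure
        import trimesh

        volume = np.load("ct_volume.npy")        # 3-D intensity array imported from DICOM
        bone_mask = volume > 300                 # crude Hounsfield-unit threshold for bone

        # Extract a triangulated surface of the mask and write it as STL.
        verts, faces, _, _ = measure.marching_cubes(bone_mask.astype(float), level=0.5)
        trimesh.Trimesh(vertices=verts, faces=faces).export("skull_layer.stl")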

  5. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations that allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads with the tool changer also enables automated changeover between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  6. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes by which the quality of a software product is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not currently being used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in its software assurance life cycle process.
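
    As one concrete instance of the kind of product metric discussed above (not one named in the report), defect density relates defects found to code size; a minimal sketch with illustrative figures:

        def defect_density(defects_found, lines_of_code):
            """Defects per thousand lines of code (KLOC)."""
            return defects_found / (lines_of_code / 1000.0)

        print(defect_density(42, 56_000))   # -> 0.75 defects per KLOC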

  7. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  8. Software Tools for Lifetime Assessment of Thermal Barrier Coatings Part I — Thermal Ageing Failure and Thermal Fatigue Failure

    NASA Astrophysics Data System (ADS)

    Renusch, Daniel; Rudolphi, Mario; Schütze, Michael

    Thermal barrier coatings (TBCs) increase the service lifetime of specific components in, for example, gas turbines or airplane engines, and allow higher operating temperatures to increase efficiency. Lifetime prediction models are therefore of both academic and applied interest, whether to test new coatings or to determine operating conditions that can ensure a certain lifetime, for example 25,000 hr for gas turbines. Driven by these demands, the equations used in lifetime prediction have become more and more sophisticated and consequently are complicated to apply. A collection of software tools for lifetime assessment was therefore developed to provide an easy-to-use graphical user interface whilst incorporating the recent improvements in modeling equations. The Windows-based software is compatible with other Windows applications such as PowerPoint, Excel, or Origin. Laboratory lifetime data from isothermal, thermal cyclic and/or burner rig testing can be loaded into the software for analysis, and the program provides confidence limits and an accuracy assessment of the analysis model. The main purpose of the software tool is to predict TBC spallation for a given bond coat temperature, temperature gradient across the coating, and thermal cycle frequency.

  9. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC-88) on the use of the geostationary orbit (GEO) and the planning of space services utilizing it. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  10. Illoura™: a software tool for analysis, visualization and semantic querying of cellular and other spatial biological data

    PubMed Central

    McComb, Tim; Cairncross, Oliver; Noske, Andrew B.; Wood, David L. A.; Marsh, Brad J.; Ragan, Mark A.

    2009-01-01

    Summary: New high-resolution approaches for mapping ultrastructure of cells in 3D are leading to unprecedented quantities of spatial data. Here we present Illoura, a software tool for the integrated management, analysis and visualization of these data within a semantic context, and illustrate its capability by analysis of spatial relationships in mammalian beta cells. Availability: http://www.visiblecell.com/illoura Contact: m.ragan@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19258351

  11. Productivity, part 2: cloud storage, remote meeting tools, screencasting, speech recognition software, password managers, and online data backup.

    PubMed

    Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet

    2014-06-01

    It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data. PMID:24674716

  12. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open-source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the power spectral density (PSD) in linear or log scale, as well as an evolving time-frequency representation (sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for moving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter-/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under-/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. A minimal linear algebra library (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, eigenvalue/eigenvector computation, QR-solve/eigen-solve of systems of linear equations, etc. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. Useful links: http
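
    Two of the signal-processing operations listed above, band-pass filtering and power spectral density estimation, can be reproduced in a few lines with SciPy. The synthetic trace and filter band below are illustrative, and this is not STK's code:

        import numpy as np
        from scipy import signal

        fs = 100.0                                    # sampling rate, Hz
        t = np.arange(0, 60, 1 / fs)
        trace = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)

        # Zero-phase band-pass filtering with a 4th-order Butterworth IIR filter.
        sos = signal.butter(4, [0.5, 5.0], btype="bandpass", fs=fs, output="sos")
        filtered = signal.sosfiltfilt(sos, trace)

        # Power spectral density estimate (Welch's method).
        freqs, psd = signal.welch(filtered, fs=fs, nperseg=1024)
        print("peak PSD frequency: %.2f Hz" % freqs[np.argmax(psd)])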

  13. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    NASA Astrophysics Data System (ADS)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
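
    GUM2DFT propagates uncertainties through the DFT with closed formulas; as a hedged illustration of the same idea, the propagation can also be checked by Monte Carlo: draw perturbed time-domain signals, transform each, and take the empirical spread of the spectra. The signal and per-sample uncertainty below are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        n, runs = 64, 5000
        x = np.sin(2 * np.pi * 3 * np.arange(n) / n)    # estimate of the time-domain signal
        ux = 0.02                                       # standard uncertainty per sample

        spectra = np.fft.rfft(x + ux * rng.standard_normal((runs, n)), axis=1)
        u_real = spectra.real.std(axis=0)               # uncertainty of Re X[k]
        u_imag = spectra.imag.std(axis=0)               # uncertainty of Im X[k]
        print("max spectral uncertainty (Re, Im): %.4f, %.4f" % (u_real.max(), u_imag.max()))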

  14. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    USGS Publications Warehouse

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G., III

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  15. New EPA 'PLUS' software: A useful tool for local emergency planners

    SciTech Connect

    Anastas, P.T.; Tobin, P.S.

    1993-09-01

    EPA's Office of Pollution Prevention and Toxics has produced a new bibliographic database designed for local emergency planners. The PLUS'' (Planner's Library in User-friendly Software) system is a resource database containing abstracts of hundreds of references useful in planning for hazardous substances emergencies. The software's structure allows planners to construct customized search strategies for generating lists of emergency planning-related references on subjects ranging from chemical profiles to personal protective gear.

  16. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code need focus primarily on the tool functionality, thus, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  17. Gammasphere software development. Progress report

    SciTech Connect

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  18. Usefulness of anterior uveitis as an additional tool for diagnosing incomplete Kawasaki disease

    PubMed Central

    Lee, Kyu Jin; Kim, Hyo Jin; Kim, Min Jae; Yoon, Ji Hong; Lee, Eun Jung; Lee, Jae Young; Oh, Jin Hee; Lee, Soon Ju; Lee, Kyung Yil

    2016-01-01

    Purpose There are no specific tests for diagnosing Kawasaki disease (KD). Additional diagnostic criteria are needed to prevent the delayed diagnosis of incomplete Kawasaki disease (IKD). This study compared the frequency of coronary artery lesions (CALs) in IKD patients with and without anterior uveitis (AU) and elucidated whether the finding of AU supported the diagnosis of IKD. Methods This study enrolled patients diagnosed with IKD at The Catholic University of Korea, Uijeongbu St. Mary's Hospital from January 2010 to December 2014. The patients were divided into 2 groups: group 1 included patients with IKD having AU; and group 2 included patients with IKD without AU. We analyzed the demographic and clinical data (age, gender, duration of fever, and the number of diagnostic criteria), laboratory results, and echocardiographic findings. Results Of 111 patients with IKD, 41 had uveitis (36.98%, group 1) and 70 did not (63.02%, group 2). Patients in group 1 had received a diagnosis and treatment earlier, and had fewer CALs (3 of 41, 1.7%) than those in group 2 (20 of 70, 28.5%) (P=0.008). All 3 patients with CALs in group 1 had coronary dilatation, while patients with CALs in group 2 had CALs ranging from coronary dilatation to giant aneurysm. Conclusion The diagnosis of IKD is challenging but can be supported by the presence of features such as AU. Group 1 had a lower risk of coronary artery disease than group 2. Therefore, the presence of AU is helpful in the early diagnosis and treatment of IKD and can be used as an additional diagnostic tool. PMID:27186227

  19. MapNext: a software tool for spliced and unspliced alignments and SNP detection of short sequence reads

    PubMed Central

    2009-01-01

    Background Next-generation sequencing technologies provide exciting avenues for studies of transcriptomics and population genomics. There is an increasing need to conduct spliced and unspliced alignments of short transcript reads onto a reference genome and estimate minor allele frequency from sequences of population samples. Results We have designed and implemented MapNext, a software tool for both spliced and unspliced alignments of short sequence reads onto reference sequences, and automated SNP detection using neighbourhood quality standards. MapNext provides four main analyses: (i) unspliced alignment and clustering of reads, (ii) spliced alignment of transcript reads over intron boundaries, (iii) SNP detection and estimation of minor allele frequency from population sequences, and (iv) storage of result data in a database to make it available for more flexible queries and for further analyses. The software tool has been tested using both simulated and real data. Conclusion MapNext is a comprehensive and powerful tool for both spliced and unspliced alignments of short reads and automated SNP detection from population sequences. The simplicity, flexibility and efficiency of MapNext makes it a valuable tool for transcriptomic and population genomic research. PMID:19958476

  20. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    NASA Astrophysics Data System (ADS)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application, as part of an emission inventory system, that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced; it also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
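
    The spatial-allocation step at the heart of such a system distributes a category total over grid cells in proportion to a proxy layer (road length, population, land use, etc. per cell). A minimal sketch with invented numbers, not the paper's implementation:

        import numpy as np

        total_emission = 1200.0                  # annual total for one source category, t
        weights = np.array([[0.0, 2.0, 1.0],
                            [4.0, 8.0, 3.0],
                            [1.0, 0.5, 0.5]])    # proxy activity per grid cell

        grid = total_emission * weights / weights.sum()  # allocated emissions per cell
        print(grid.round(1), grid.sum())                 # cell values sum to the total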

  1. Comparison of a Web-Based Dietary Assessment Tool with Software for the Evaluation of Dietary Records

    PubMed Central

    BENEDIK, Evgen; KOROUŠIĆ SELJAK, Barbara; HRIBAR, Maša; ROGELJ, Irena; BRATANIČ, Borut; OREL, Rok; FIDLER MIS, Nataša

    2015-01-01

    Background Dietary assessment in clinical practice is performed by means of computer support, either in the form of a web-based tool or software. The aim of the paper is to present the results of the comparison of a Slovenian web-based tool with German software for the evaluation of four-day weighted paper-and-pencil-based dietary records (paper-DRs) in pregnant women. Methods A volunteer group of pregnant women (n=63) completed paper-DRs. These records were entered by an experienced research dietitian into a web-based application (Open Platform for Clinical Nutrition, OPEN, http://opkp.si/en, Ljubljana, Slovenia) and software application (Prodi 5.7 Expert plus, Nutri-Science, Stuttgart, Germany, 2011). The results for calculated energy intake, as well as 45 macro- and micronutrient intakes, were statistically compared by using the non-parametric Spearman’s rank correlation coefficient. The cut-off for Spearman’s rho was set at >0.600. Results 12 nutritional parameters (energy, carbohydrates, fat, protein, water, potassium, calcium, phosphorus, dietary fiber, vitamin C, folic acid, and stearic acid) were in high correlation (>0.800), 18 in moderate (0.600–0.799), 11 in weak correlation (0.400–0.599), while 5 (arachidonic acid, niacin, alpha-linolenic acid, fluoride, total sugars) did not show any statistical correlation. Conclusion Comparison of the results of the evaluation of dietary records using a web-based dietary assessment tool with those using software shows that there is a high correlation for energy and macronutrient content.

  2. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis

    PubMed Central

    Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages (‘Edgewarp’ and ‘Morpho’) for the same sliding task, and investigate potential differences in the results and biological interpretation. ‘Morpho’ is much faster than ‘Edgewarp,’ notably as a result of the greater computational power of the ‘Morpho’ software routines and the complexity of the ‘Edgewarp’ workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086

  3. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis.

    PubMed

    Botton-Divet, Léo; Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086

  4. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    SciTech Connect

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  5. Development of a case tool to support decision based software development

    NASA Technical Reports Server (NTRS)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: demonstrated DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in solving the reengineering problem; and developed a preliminary chart of the problems involved in the reengineering process.

  6. ARCHER, a New Monte Carlo Software Tool for Emerging Heterogeneous Computing Environments

    NASA Astrophysics Data System (ADS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2014-06-01

    The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.

  7. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of the digital teaching aids used in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  8. Tool Match. Review Software for Basic CHOICE. CHOICE (Challenging Options in Career Education).

    ERIC Educational Resources Information Center

    Pitts, Ilse M.; And Others

    CHOICE Tool Match is an Apple computer concentration-type activity in which learners select two numbered windows in an attempt to match the tools displayed, reinforcing job and role information presented in the CHOICE Basic Job and Role activity folders and workbooks for migrant students. In place of written directions, the learner is provided…

  9. Software tool for the analysis and visualization of whole genome alignments

    2011-08-01

    GenomeVISTA is a tool that performs and displays pairwise and multiple whole-genome DNA alignments. The tool provides a graphical user interface by which users can navigate alignments at multiple levels of resolution and get information about individual aligned regions. Users can load their own sequences into GenomeVISTA or view pre-computed alignments for genomes in the VISTA database.

  10. The Rat Genome Database curation tool suite: a set of optimized software tools enabling efficient acquisition, organization, and presentation of biological data

    PubMed Central

    Laulederkind, Stanley J. F.; Shimoyama, Mary; Hayman, G. Thomas; Lowry, Timothy F.; Nigam, Rajni; Petri, Victoria; Smith, Jennifer R.; Wang, Shur-Jen; de Pons, Jeff; Kowalski, George; Liu, Weisong; Rood, Wes; Munzenmaier, Diane H.; Dwinell, Melinda R.; Twigger, Simon N.; Jacob, Howard J.

    2011-01-01

    The Rat Genome Database (RGD) is the premier repository of rat genomic and genetic data and currently houses over 40 000 rat gene records as well as human and mouse orthologs, 1771 rat and 1911 human quantitative trait loci (QTLs) and 2209 rat strains. Biological information curated for these data objects includes disease associations, phenotypes, pathways, molecular functions, biological processes and cellular components. A suite of tools has been developed to aid curators in acquiring and validating data objects, assigning nomenclature, attaching biological information to objects and making connections among data types. The software used to assign nomenclature, to create and edit objects and to make annotations to the data objects has been specifically designed to make the curation process as fast and efficient as possible. The user interfaces have been adapted to the work routines of the curators, creating a suite of tools that is intuitive and powerful. Database URL: http://rgd.mcw.edu PMID:21321022

  11. OutbreakTools: A new platform for disease outbreak analysis using the R software

    PubMed Central

    Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

    2014-01-01

    The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

  12. The Facial Aesthetic index: An additional tool for assessing treatment need

    PubMed Central

    Sundareswaran, Shobha; Ramakrishnan, Ranjith

    2016-01-01

    Objectives: Facial aesthetics, a major consideration in orthodontic diagnosis and treatment planning, may not be judged correctly and completely by simply analyzing dental occlusion or osseous structures. Despite this importance, there is no index to guarantee availability of treatment or to prioritize patients based on their soft tissue treatment needs. Individuals having well-aligned teeth but unaesthetic convex profiles do not get included for treatment under current malocclusion indices. The aim of this investigation is to develop an aesthetic index based on facial profiles which could be used as an additional tool alongside malocclusion indices. Materials and Methods: A chart showing typical facial profile changes due to underlying malocclusions was generated by soft tissue manipulations of standardized profile photographs of a well-balanced male and female face. A panel of 62 orthodontists judged the profile photographs of 100 patients with different soft tissue patterns to assess profile variations and treatment need. The index was later tested in a cross-section of the school population. Statistical analysis was done using the “irr” package of the R environment, version 2.15.1. Results: The index exhibited very good reliability in determining profile variations (Fleiss kappa 0.866, P < 0.001), excellent reproducibility (kappa 0.9078), and high sensitivity and specificity (95.7%). Testing in the population yielded excellent agreement among orthodontists (kappa 0.9286). Conclusions: A new Facial Aesthetic index, based on the patient's soft tissue profile requirements, is proposed, which can complement existing indices to ensure treatment for those in need. PMID:27127752

  13. Evaluating Difficulty Levels of Dynamic Geometry Software Tools to Enhance Teachers' Professional Development

    ERIC Educational Resources Information Center

    Hohenwarter, Judith; Hohenwarter, Markus; Lavicza, Zsolt

    2010-01-01

    This paper describes a study aimed to identify commonly emerging impediments related to the introduction of dynamic mathematics software. We report on the analysis of data collected during a three-week professional development programme organised for middle and high school teachers in Florida. The study identified challenges that participants face…

  14. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    ERIC Educational Resources Information Center

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  15. Designing, Developing and Implementing a Software Tool for Scenario Based Learning

    ERIC Educational Resources Information Center

    Norton, Geoff; Taylor, Mathew; Stewart, Terry; Blackburn, Greg; Jinks, Audrey; Razdar, Bahareh; Holmes, Paul; Marastoni, Enrique

    2012-01-01

    The pedagogical value of problem-based and inquiry-based learning activities has led to increased use of this approach in many courses. While scenarios or case studies were initially presented to learners as text-based material, the development of modern software technology provides the opportunity to deliver scenarios as e-learning modules,…

  16. TeraTools: Multiparameter data acquisition software for the Windows 95/NT OS

    SciTech Connect

    Piercey, R.B.

    1997-12-31

    TeraTools, a general purpose, multiparameter data acquisition application for Windows 95/NT, is described. It is based on the Kmax architecture, which has been used since 1986 on the Macintosh computer at numerous industrial, educational, and research sites world-wide. TeraTools includes high-level support for industry-standard modular instrumentation; a built-in scripting language; drivers for commercially available interfaces; hooks for external code extensions; event file sorting and replay; and a full set of histogramming and display tools. The environment is scalable and may be applied to problems involving a few parameters or many.

  17. AcquiTools: A new Software Toolkit for the Efficient Preparation of DMC-Ready Waveform Data

    NASA Astrophysics Data System (ADS)

    Golden, S.

    2009-12-01

    Many seismic projects make use of the excellent infrastructure provided by the IRIS Data Management Center (DMC) for archival and distribution of waveform data. This usually requires the data to be submitted to the DMC in SEED (Standard for the Exchange of Earthquake Data) format. Some current data loggers therefore already record data in a waveform-only subset of the SEED format called miniSEED. Nevertheless, recordings from other data loggers, such as the Reftek RT130, first need to be converted. One standard procedure for RT130 data involves a software package distributed by the IRIS PASSCAL Instrument Center. Its use requires the sequential application of at least two conversion programs, and more if anything beyond a minimum of manipulations is needed. A new tool, named “ckreftekt”, was developed to combine several of these processing steps into one. It thereby simplifies the low-level data processing to the point where a relatively simple shell script is sufficient to process the data set of an entire experiment, making the low-level data processing automatically reproducible. As side effects, computation time and disk usage are significantly reduced. So far the tool has only been used in-house; in that time, more than 2 TB of waveform data have been processed and submitted to the DMC, mostly from the High Lava Plains (HLP) and Structural Change projects. The program “ckreftekt” is part of a larger new software toolkit named AcquiTools, which aims to simplify a series of similar low-level data handling processes. This contribution is an attempt to introduce AcquiTools to the community as an open source tool, and to gather feedback on where its development should be headed in the future.

  18. CREATING INTEROPERABLE MESHING AND DISCRETIZATION SOFTWARE: THE TERASCALE SIMULATION TOOLS AND TECHNOLOGY CENTER.

    SciTech Connect

    BROWN,D.; FREITAG,L.; GLIMM,J.

    2002-06-02

    We present an overview of the technical objectives of the Terascale Simulation Tools and Technologies center. The primary goal of this multi-institution collaboration is to develop technologies that enable application scientists to easily use multiple mesh and discretization strategies within a single simulation on terascale computers. The discussion focuses on our efforts to create interoperable mesh generation tools, high-order discretization techniques, and adaptive meshing strategies.

  19. Creating Interoperable Meshing and Discretization Software: The Terascale Simulation Tools and Technology Center

    SciTech Connect

    Brown, D.; Freitag, L.; Glimm, J.

    2002-03-28

    We present an overview of the technical objectives of the Terascale Simulation Tools and Technologies center. The primary goal of this multi-institution collaboration is to develop technologies that enable application scientists to easily use multiple mesh and discretization strategies within a single simulation on terascale computers. The discussion focuses on our efforts to create interoperable mesh generation tools, high-order discretization techniques, and adaptive meshing strategies.

  20. RDFTools: a software tool for quantifying short-range ordering in amorphous materials.

    PubMed

    Mitchell, D R G; Petersen, T C

    2012-02-01

    A software package for computing radial distribution functions and other pair correlation functions from electron diffraction patterns of disordered solids is presented. The package, called RDFTools, is freely available via the internet and allows rapid in situ measurements of such quantities as interatomic nearest-neighbor distances, average bond angles, and coordination numbers. The software runs under DigitalMicrograph™ (Gatan, Pleasanton, California), a very widely used program in transmission electron microscopy. All implemented algorithms have been designed to compute diffraction integrals and data-processing averages in a fast and efficient manner, to enable quick production of publication-ready, quantitative pair distribution function information. In the development of RDFTools, significant attention was paid to providing a robust and intuitive user interface for deriving reliable semiquantitative information. For example, RDFTools enables accurate pair separation distances to be revealed upon immediate interrogation at the microscope, even for potentially thick specimens and/or regions of unknown elemental composition. PMID:21761497
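
    The central transform behind a diffraction-derived RDF is the sine transform of the reduced intensity, G(r) = (2/π) ∫ q[S(q) − 1] sin(qr) dq. The sketch below evaluates it numerically for a toy structure factor; it is an illustration of the transform, not RDFTools' implementation:

        import numpy as np

        q = np.linspace(0.5, 20.0, 400)                  # scattering vector, 1/Angstrom
        s_q = 1.0 + 0.5 * np.exp(-((q - 3.0) ** 2))      # toy structure factor with one peak
        phi = q * (s_q - 1.0)                            # reduced intensity

        r = np.linspace(0.5, 10.0, 200)                  # real-space distance, Angstrom
        g_r = (2.0 / np.pi) * np.trapz(phi[None, :] * np.sin(np.outer(r, q)), q, axis=1)
        print("first-neighbour peak near r = %.2f Angstrom" % r[np.argmax(g_r)])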

  1. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    NASA Astrophysics Data System (ADS)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.
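
    The state-machine idea behind the toolkit's robustness can be sketched in a few lines: a component only acts on inputs that are legal transitions from its current state, so invalid requests are rejected instead of reaching unsafe code paths. The states and inputs below are illustrative, not IGSTK's actual transition tables:

        TRANSITIONS = {
            ("Idle", "ConnectTracker"): "TrackerReady",
            ("TrackerReady", "StartTracking"): "Tracking",
            ("Tracking", "StopTracking"): "TrackerReady",
        }

        class TrackerComponent:
            def __init__(self):
                self.state = "Idle"

            def handle(self, event):
                next_state = TRANSITIONS.get((self.state, event))
                if next_state is None:
                    print(f"ignored {event!r} in state {self.state!r}")  # invalid request rejected
                else:
                    self.state = next_state

        c = TrackerComponent()
        c.handle("StartTracking")   # ignored: not a legal transition from Idle
        c.handle("ConnectTracker")
        c.handle("StartTracking")
        print(c.state)              # Tracking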

  2. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets

    PubMed Central

    Johnson, Z. P.; Eady, R. D.; Ahmad, S. F.; Agravat, S.; Morris, T; Else, J; Lank, S. M.; Wiseman, R. W.; O’Connor, D. H.; Penedo, M. C. T.; Larsen, C. P.

    2012-01-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permitsmultiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox onWindows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie. kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo, user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  3. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    PubMed

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  4. DeconMSn: A Software Tool for accurate parent ion monoisotopic mass determination for tandem mass spectra

    SciTech Connect

    Mayampurath, Anoop M.; Jaitly, Navdeep; Purvine, Samuel O.; Monroe, Matthew E.; Auberry, Kenneth J.; Adkins, Joshua N.; Smith, Richard D.

    2008-04-01

    We present a new software tool for tandem MS analyses that:
    • accurately calculates the monoisotopic mass and charge of high-resolution parent ions (sketched in the example after this list)
    • operates accurately regardless of the mass selected for fragmentation
    • performs independently of instrument settings
    • enables optimal selection of the search mass tolerance for high-mass-accuracy experiments
    • is open source and can thus be tailored to individual needs
    • incorporates an SVM-based charge detection algorithm for analyzing low-resolution tandem MS spectra
    • creates multiple output data formats (.dta, .MGF)
    • handles .RAW files and .mzXML formats
    • is compatible with SEQUEST, MASCOT, and X!Tandem
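
    The first capability, parent-ion monoisotopic mass correction, can be illustrated compactly. The sketch below (illustrative only, not DeconMSn's actual algorithm) converts a selected m/z and charge to a neutral mass and enumerates candidate monoisotopic masses for the case where the instrument isolated a higher isotopic peak.

```python
# Sketch of correcting a parent ion's monoisotopic mass when the instrument
# selected a non-monoisotopic isotopic peak. Simplified illustration only.

PROTON = 1.00727646688        # proton mass, Da
ISOTOPE_SPACING = 1.00335483  # 13C-12C mass difference, Da

def neutral_mass(mz, charge):
    """Neutral mass from an observed m/z and charge state."""
    return mz * charge - charge * PROTON

def candidate_monoisotopic_masses(selected_mz, charge, max_offset=3):
    """If the selected peak was the +1, +2, ... isotope, the true
    monoisotopic mass lies one or more isotope spacings lower."""
    base = neutral_mass(selected_mz, charge)
    return [base - k * ISOTOPE_SPACING for k in range(max_offset + 1)]

for m in candidate_monoisotopic_masses(785.8421, charge=2):
    print(round(m, 4))
```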

  5. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    NASA Astrophysics Data System (ADS)

    Marshall, I. P. G.

    2014-12-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and command-line interface. It can be accessed online at http://130.226.247.137/.
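
    As a rough illustration of the planning problem Corganiser addresses, the sketch below greedily fits requested whole-round samples into a core section of fixed length. The logic and names are hypothetical, not the actual Corganiser code.

```python
# Hypothetical sketch of whole-round sample planning: place requested
# samples one after another in a section, failing if they do not fit.

def plan_whole_rounds(section_length_cm, requests):
    """requests: list of (label, length_cm); returns (label, top, bottom) tuples."""
    plan, cursor = [], 0.0
    for label, length in requests:
        if cursor + length > section_length_cm:
            raise ValueError(f"section full before sample {label!r}")
        plan.append((label, cursor, cursor + length))
        cursor += length
    return plan

for label, top, bottom in plan_whole_rounds(150.0, [("microbiology", 20), ("geochemistry", 10)]):
    print(f"{label:<13} {top:6.1f}-{bottom:.1f} cm")
```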

  6. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    NASA Astrophysics Data System (ADS)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (Forams) are single-celled amoeba-like organisms in the sea, which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and their presence in all marine deposits made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides the success of forams in the oil and gas industry, they are also a most powerful tool for reconstructing climate change in the past. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. However, until recently the best information on this architecture was obtained only by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues about internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all important parameters that affect the accuracy of the results and the speed of data processing. Combining this tomography technique with specific image processing software, called segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is
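
    The segmentation step mentioned above can be illustrated with standard open-source tools. The sketch below assumes a NumPy volume and uses Otsu thresholding from scikit-image as a stand-in for the (unspecified) algorithms under evaluation.

```python
# Threshold-based segmentation sketch: binary shell mask from a CT volume.
# Otsu thresholding is a stand-in; the paper evaluates several algorithms.

import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

rng = np.random.default_rng(0)
volume = rng.normal(0.2, 0.05, (64, 64, 64))   # stand-in for a nano-CT scan
volume[16:48, 16:48, 16:48] += 0.5             # bright "shell" region

mask = volume > threshold_otsu(volume)         # binary segmentation
regions = label(~mask)                         # connected non-shell regions
print("shell voxels:", int(mask.sum()), "non-shell regions:", regions.max())
```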

  7. USER'S GUIDE: Strategic Waste Minimization Initiative (SWAMI) Version 2.0 - A Software Tool to Aid in Process Analysis for Pollution Prevention

    EPA Science Inventory

    The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...

  8. An Automated Software Package for the KISS Objective-Prism Survey for Emission-Line Galaxies. II. Recent Additions and Project Status

    NASA Astrophysics Data System (ADS)

    Frattare, L. M.; Salzer, J. J.

    1996-05-01

    We present an update on the KPNO International Spectroscopic Survey (KISS) project. KISS is a wide-field survey for extragalactic emission-line objects being carried out with the Burrell Schmidt at Kitt Peak. While we are utilizing the classical objective-prism technique to find strong-lined star-forming galaxies and AGNs, the use of CCD detectors and automated reduction software promise to make KISS a powerful tool for the study of activity in galaxies. We are currently completing our first survey strip (100 square degrees). The data consist of deep (to B = 20) objective-prism images, deep direct images in both B and V, and small-format photometric calibration images of each field. The KISS reduction package was designed to run under the IRAF image processing environment, and will eventually grow to be a complete IRAF package. Tasks added to the package over the past year include precise astrometry and photometry modules. The astrometry routines utilize the HST Guide Star Catalog to perform a full plate solution on the direct image of each Schmidt field, and then assign accurate equatorial coordinates to each object in the field. The photometry module performs aperture photometry on the direct images for all objects in the KISS database catalog, and provides routines to transfer the photometry calibration from the small-format images taken under photometric conditions to the large-format survey images. Extensive tests and modifications have also been carried out on the pre-existing software described by Herrero & Salzer (1995) in order to better fine-tune the reduction procedures and parameter settings. In addition to presenting a complete description of the new software, we describe the current status of the survey and present some preliminary characteristics of the sample. Other members of the KISS project include V. Lipovetsky & A. Kniazev (S.A.O.), T. Boroson (NOAO/USGP), T. Thuan (U. Virginia), J. Moody (BYU), Y. Izotov (Ukrainian Acad. Sci.), and J. Herrero
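
    The aperture-photometry module has a close modern analogue. Below is a minimal sketch using photutils in place of the original IRAF tasks; it is illustrative, not the KISS package itself.

```python
# Aperture photometry sketch: sum counts in a circular aperture around
# catalog positions. photutils stands in for the IRAF-based KISS module.

import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

image = np.zeros((100, 100))
image[50, 50] = 1000.0                     # a fake star
positions = [(50.0, 50.0)]                 # object coordinates from a catalog
apertures = CircularAperture(positions, r=5.0)
table = aperture_photometry(image, apertures)
print(float(table["aperture_sum"][0]))     # instrumental flux in the aperture
```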

  9. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    NASA Technical Reports Server (NTRS)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  10. Software Solutions for ICME

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Engstrom, A.; Bernhardt, R.; Prahl, U.; Adam, L.; Seyfarth, J.; Apel, M.; de Saracibar, C. Agelet; Korzhavyi, P.; Ågren, J.; Patzak, B.

    2016-01-01

    The Integrated Computational Materials Engineering expert group (ICMEg), a coordination activity of the European Commission, aims at developing a global and open standard for information exchange between the heterogeneous varieties of numerous simulation tools. The ICMEg consortium coordinates respective developments by a strategy of networking stakeholders in the first International Workshop on Software Solutions for ICME, compiling identified and relevant software tools into the Handbook of Software Solutions for ICME, discussing strategies for interoperability between different software tools during a second (planned) international workshop, and eventually proposing a scheme for standardized information exchange in a future book or document. The present article summarizes these respective actions to provide the ICME community with some additional insights and resources from which to help move this field forward.

  11. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    NASA Astrophysics Data System (ADS)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-07-01

    Ionic Polymer Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case, a voltage input causes the membrane to bend; in the latter, bending an IP2C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, the geometrical quantities that govern the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed to give users a tool for understanding the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and analyze their characteristics. This could greatly increase researchers' interest in this new class of transducers; moreover, it can support the educational activity of students in advanced academic courses.
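
    To give a concrete flavor of "estimating the dynamic behaviour" of such a sensor, the sketch below treats the bending-to-voltage response as a linear transfer function. The second-order model and its parameters are placeholders, not the IP2C model from the paper.

```python
# Placeholder dynamic model: sensor output voltage as the step response of
# a second-order transfer function. Parameters are hypothetical.

import numpy as np
from scipy import signal

wn, zeta, gain = 12.0, 0.3, 2.0e-3           # natural freq, damping, gain
sensor = signal.TransferFunction([gain * wn**2], [1, 2 * zeta * wn, wn**2])

t = np.linspace(0, 3, 500)
t, v_out = signal.step(sensor, T=t)          # voltage response to a step bend
print("peak output: %.2e V at t = %.2f s" % (v_out.max(), t[np.argmax(v_out)]))
```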

  12. Experiences and perspectives with SRI's tools for software design and validation

    NASA Technical Reports Server (NTRS)

    Goguen, J.; Levitt, K. N.

    1982-01-01

    Development of tools is reported, including the STP theorem prover and its associated Design Verification System; PHIL, a meta-programmable context-sensitive structured editor; Pegasus, a system supporting graphical programming; and OBJ, an ultra-high-level programming language based on rewrite rules and abstract data types.

  13. Photovoltaic array performance and life-cycle cost simulation using new software tools

    NASA Technical Reports Server (NTRS)

    Daniel, R. E.; Burger, D. R.; Reiter, L. J.

    1985-01-01

    The three computer models, SAMICS, PVARRAY, and LCP, can be used together as a single analytical tool to compare the lifetime economic value of a photovoltaic (PV) array. This evaluation can be used to compare various module and array configurations and the performance characteristics of different module manufacturing technologies.
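
    The life-cycle comparison these models support can be sketched in a few lines: discount operating costs and energy yield over the array lifetime and compare configurations on a cost-per-kWh basis. All numbers below are hypothetical and the calculation is far simpler than LCP.

```python
# Back-of-the-envelope life-cycle cost per kWh. Hypothetical numbers.

def lifecycle_cost_per_kwh(capital, annual_om, annual_kwh, rate, years):
    discount = [(1 + rate) ** -y for y in range(1, years + 1)]
    cost = capital + annual_om * sum(discount)      # capital + discounted O&M
    energy = annual_kwh * sum(discount)             # discounted energy yield
    return cost / energy

print(round(lifecycle_cost_per_kwh(12000, 150, 9000, 0.05, 20), 4), "$/kWh")
```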

  14. What software tools can I use to view ERBE HDF data products?

    Atmospheric Science Data Center

    2014-12-08

    Visualize ERBE data with view_hdf, a visualization and analysis tool for accessing data stored in Hierarchical Data Format (HDF) and HDF-EOS. To open a file in HDFView: start HDFView, select File, select Open, and select the file to be viewed. See also: ERBE Data Access.

  15. Word-Tool Match. Review Software for Basic CHOICE. CHOICE (Challenging Options in Career Education).

    ERIC Educational Resources Information Center

    Pitts, Ilse M.; And Others

    CHOICE Word-Tool Match provides migrant youth the opportunity to use the computer in self-directed ways, while reinforcing job and role information presented in Basic Job and Role activity folders and workbooks. Learners select whether to play with one or two players, the career that will provide the theme for the game, and whether to play the…

  16. Final Report "CoDeveloper: A Secure Web-Invocable Collaborative Software Development Tool"

    SciTech Connect

    Svetlana Shasharina

    2005-11-27

    Modern scientific simulations generate large datasets at remote sites with appropriate resources (supercomputers and clusters). Bringing these large datasets to the computers of all members of a distributed team of collaborators is often impractical or even impossible: there might not be enough bandwidth, storage capacity or appropriate data analysis and visualization tools locally available. To address the need to access remote data while avoiding heavy Internet traffic and unnecessary data replication, Tech-X Corporation developed a tool that allows remote data visualization to be run collaboratively and the resulting visualization objects to be shared as they are generated. The size of these objects is typically much smaller than the size of the original data. For marketing reasons, we renamed the product CoReViz. Detailed information on this product can be found at http://www.txcorp.com/products/CoReViz/. We installed and tested this tool on multiple machines at Tech-X and on seaborg at NERSC. In what follows, we give a detailed description of this tool.

  17. Blogs and Wikis as Instructional Tools: A Social Software Adaptation of Just-in-Time Teaching

    ERIC Educational Resources Information Center

    Higdon, Jude; Topaz, Chad

    2009-01-01

    Just-in-Time Teaching (JiTT) methodology uses Web-based tools to gather student responses to questions on preclass reading assignments. However, the technological requirements of JiTT and the content-specific nature of the questions may prevent some instructors from implementing it. Our own JiTT implementation uses publicly and freely available…

  18. The Viability of a Software Tool to Assist Students in the Review of Literature

    ERIC Educational Resources Information Center

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem designed to conduct original research. Students need proper training and tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  19. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
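
    A toy version of the grouping step makes this concrete: given pairwise compatibility at discrete arc positions, a group is feasible over the positions shared by all of its pairs. This is purely illustrative; the actual NASARC procedure is far more detailed.

```python
# Toy NASARC-style grouping: intersect pairwise-compatible arc positions.

from itertools import combinations

# (admin a, admin b) -> arc positions where their satellites can be co-located
compatible = {
    ("A", "B"): {10, 20},
    ("A", "C"): {20},
    ("B", "C"): {20, 30},
}

def group_arc(group):
    """Arc positions where every pair in the group is compatible."""
    arcs = None
    for pair in combinations(sorted(group), 2):
        pair_arcs = compatible.get(pair, set())
        arcs = pair_arcs if arcs is None else arcs & pair_arcs
    return arcs or set()

for group in ({"A", "B"}, {"A", "B", "C"}):
    print(sorted(group), "->", sorted(group_arc(group)))
```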

  20. MoRFchibi SYSTEM: software tools for the identification of MoRFs in protein sequences.

    PubMed

    Malhis, Nawar; Jacobson, Matthew; Gsponer, Jörg

    2016-07-01

    Molecular recognition features, MoRFs, are short segments within longer disordered protein regions that bind to globular protein domains in a process known as disorder-to-order transition. MoRFs have been found to play a significant role in signaling and regulatory processes in cells. High-confidence computational identification of MoRFs remains an important challenge. In this work, we introduce MoRFchibi SYSTEM, which contains three MoRF predictors: MoRFCHiBi, a basic predictor best suited as a component in other applications; MoRFCHiBi_Light, ideal for high-throughput predictions; and MoRFCHiBi_Web, slower than the other two but best for high-accuracy predictions. Results show that MoRFchibi SYSTEM provides more than double the precision of other predictors. MoRFchibi SYSTEM is available in three different forms: as an HTML web server, a RESTful web server, and downloadable software at: http://www.chibi.ubc.ca/faculty/joerg-gsponer/gsponer-lab/software/morf_chibi/. PMID:27174932

  1. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
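
    One standard processing task such a portal offers, gridding a point cloud into a DEM, reduces to binning points and averaging elevations per cell. A minimal sketch (illustrative only; production pipelines are more sophisticated):

```python
# Grid a LIDAR point cloud to a DEM by averaging elevations per cell.

import numpy as np

def points_to_dem(x, y, z, cell=1.0):
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    total = np.zeros((iy.max() + 1, ix.max() + 1))
    counts = np.zeros_like(total)
    np.add.at(total, (iy, ix), z)    # sum of elevations per cell
    np.add.at(counts, (iy, ix), 1)   # number of returns per cell
    with np.errstate(invalid="ignore"):
        return np.where(counts > 0, total / counts, np.nan)

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 10, 2000), rng.uniform(0, 10, 2000)
z = 0.5 * x + rng.normal(0, 0.1, 2000)      # synthetic sloping surface
print(points_to_dem(x, y, z, cell=2.0).round(2))
```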

  2. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    SciTech Connect

    Chen, G; Ahunbay, E; Li, X

    2015-06-15

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary MU calculation considering the effect of magnetic field from MR-Linac, and verifying the delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to TPS (e.g., Monaco, Elekta), R&V system (Mosaiq, Elekta), and secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in different systems, ArtQA detects and outputs discrepancies between TPS, R&V system and secondary MU calculation system, and delivery. To consider the effect of 1.5T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly-used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA check, eliminating human error associated with visual inspection. While this tool is developed for online replanning to be used on MR-Linac, where the QA needs to be performed rapidly as the patient is lying on the table waiting for the treatment, ArtQA can be used as a general QA tool
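
    The classical technique behind the secondary MU check can be sketched simply: Clarkson-style integration averages a radius-dependent scatter quantity over equal angular sectors of an irregular field. The field shape and scatter model below are hypothetical, and the sketch ignores the magnetic-field modification that is the paper's actual contribution.

```python
# Simplified Clarkson-style sector integration. Toy field and scatter model.

import math

def sector_average(field_radius, scatter_at_radius, n_sectors=36):
    """Average the scatter contribution over n equal angular sectors."""
    total = 0.0
    for i in range(n_sectors):
        angle = 2 * math.pi * (i + 0.5) / n_sectors
        total += scatter_at_radius(field_radius(angle))
    return total / n_sectors

# Elliptical field (semi-axes 5 cm and 4 cm) and a linear toy scatter model:
radius = lambda a: 5.0 * 4.0 / math.hypot(4.0 * math.cos(a), 5.0 * math.sin(a))
toy_scatter = lambda r: 0.02 * r   # stand-in for a tabulated scatter-air ratio
print(round(sector_average(radius, toy_scatter), 4))
```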

  3. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows to quantify the hazard due to hazardous phenomena and, differently from the deterministic approach, it accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialist in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work here presented fits into the problem of simplifying the transfer between scientific knowledge and land protection policies, by providing an interface between scientists, who produce PHA's results, and decision makers, who use PHA's results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software implementation has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool allows to visualize the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them, by creating ensemble models. The pyPHaz software has been designed with the features of storing and accessing all the data through a MySQL database and of being able to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend also to any other kind of hazard, as it will be shown in the applications
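
    The ensemble idea mentioned above can be made concrete in a few lines: combine exceedance-probability curves from several models into a weighted mean plus percentile bands. The data and weights below are hypothetical.

```python
# Weighted ensemble of hazard curves with crude percentile bands.

import numpy as np

intensities = np.array([0.1, 0.2, 0.5, 1.0])   # hazard intensity thresholds
curves = np.array([                            # P(exceedance) for each model
    [0.90, 0.60, 0.20, 0.05],
    [0.80, 0.50, 0.10, 0.02],
    [0.95, 0.70, 0.30, 0.08],
])
weights = np.array([0.5, 0.3, 0.2])            # model weights, sum to 1

ensemble_mean = weights @ curves
p16, p84 = np.percentile(curves, [16, 84], axis=0)
for x, m, lo, hi in zip(intensities, ensemble_mean, p16, p84):
    print(f"I = {x:4.1f}   mean = {m:.3f}   16-84%: [{lo:.3f}, {hi:.3f}]")
```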

  4. Testing the reliability of software tools in sex and ancestry estimation in a multi-ancestral Brazilian sample.

    PubMed

    Urbanová, Petra; Ross, Ann H; Jurda, Mikoláš; Nogueira, Maria-Ines

    2014-09-01

    In the framework of forensic anthropology, osteometric techniques are generally preferred over visual examinations due to a higher level of reproducibility and repeatability; qualities that are crucial within a legal context. The use of osteometric methods has been further reinforced by incorporating statistically-based algorithms and large reference samples in a variety of user-friendly software applications. However, the continued increase in admixture of human populations has made the use of osteometric methods for estimation of ancestry much more complex, which confounds one of the major requirements of ancestry assessment - intra-population homogeneity. The present paper tests the accuracy of ancestry and sex assessment using four identification software tools, specifically FORDISC 2.0, FORDISC 3.1.293, COLIPR 1.5.2 and 3D-ID 1.0. Software accuracy was tested in a sample of 174 documented human crania of Brazilian origin composed of different ancestral groups (i.e., European Brazilians, Afro-Brazilians, and Japanese Brazilians and of admixed ancestry). The results show that regardless of the software algorithm employed and the composition of the reference database, all methods were able to allocate approximately 50% of Brazilian specimens to an appropriate major reference group. Of the three ancestral groups, Afro-Brazilians were especially prone to misclassification. Japanese Brazilians, by contrast, were shown to be relatively easily recognizable as being of Asian descent but at the same time showed a strong affinity towards Hispanic crania, in particular when the classification based on FDB was carried out in FORDISC. For crania of admixed origin all of the algorithms showed a considerably higher rate of inconsistency, with a tendency for misclassification into Asian and American Hispanic groups. Sex assessments revealed an overall modest to poor reliability (60-71% of correctly classified specimens) using the tested software programs with unbalanced individual

  5. Using Teamcenter engineering software for a successive punching tool lifecycle management

    NASA Astrophysics Data System (ADS)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) integrated product lifecycle management in a virtual enterprise; the results can also be applied in a real enterprise. The product considered was a successive punching and cutting tool designed to produce a sheet metal part. The paper defines the technical documentation flow (flow of information) in the computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and transferred to other departments of the company in order to supply the materials and resources needed to realize the final product. The paper also describes the procedure for modifying certain dimensions of the sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the next lifecycle phase, the manufacturing process; for each operation of the technological process, the corresponding phases are described in detail. Teamcenter makes it possible to describe the structure of the manufacturing company, including the workstations that carry out the various manufacturing operations. The paper shows that implementing Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying and correcting documentation, while ensuring the uniqueness and completeness of the product data.

  6. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  7. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    NASA Technical Reports Server (NTRS)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  8. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare.

    PubMed

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinics) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for Chronic Disease Management and Telecare follow-up. PMID:24199051

  9. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare

    PubMed Central

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinics) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for Chronic Disease Management and Telecare follow-up. PMID:24199051

  10. Software Tools For Building Decision-support Models For Flood Emergency Situations

    NASA Astrophysics Data System (ADS)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  11. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    ERIC Educational Resources Information Center

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent researches. In traditional classroom, searching resources are limited to the library and sharing of resources is limited to the…

  12. FlowCal: A User-Friendly, Open Source Software Tool for Automatically Converting Flow Cytometry Data from Arbitrary to Calibrated Units.

    PubMed

    Castillo-Hair, Sebastian M; Sexton, John T; Landry, Brian P; Olson, Evan J; Igoshin, Oleg A; Tabor, Jeffrey J

    2016-07-15

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, nonproprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae Venus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond. PMID:27110723
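
    The calibration FlowCal automates amounts to fitting a standard curve from beads of known fluorophore content. A minimal sketch, assuming a simple log-log linear bead model (FlowCal's actual model is more careful, accounting for bead autofluorescence, for instance) and hypothetical bead values:

```python
# a.u. -> MEF calibration sketch: fit a log-log line through bead peaks.

import numpy as np

bead_au = np.array([120.0, 950.0, 7800.0, 61000.0])      # measured peaks (a.u.)
bead_mef = np.array([792.0, 6844.0, 58906.0, 496672.0])  # manufacturer values

slope, intercept = np.polyfit(np.log10(bead_au), np.log10(bead_mef), 1)

def au_to_mef(au):
    """Convert arbitrary units to molecules of equivalent fluorophore."""
    return 10 ** (intercept + slope * np.log10(au))

print(au_to_mef(np.array([500.0, 5000.0])))   # cell measurements in MEF
```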

  13. SOFTWARE TOOLS THAT ADDRESS HAZARDOUS MATERIAL ISSUES DURING NUCLEAR FACILITY D and D

    SciTech Connect

    M. COURNOYER; R. GRUNDEMANN

    2001-03-01

    The 49-year-old Chemistry and Metallurgy Research (CMR) Facility is where analytical chemistry and metallurgical studies on samples of plutonium and nuclear materials are conducted in support of the Department of Energy's nuclear weapons program. The CMR Facility is expected to be decontaminated and decommissioned (D and D) over the next ten to twenty years. Over the decades, several hazardous material issues have developed that need to be addressed. Unstable chemicals must be properly reassigned or disposed of from the workspace during D and D operations. Materials whose critical effects are primarily chronic in nature, carcinogens, reproductive toxins, and materials that exhibit high chronic toxicity have unique decontamination requirements, including the decontrolling of areas where these chemicals were used. Certain types of equipment and materials that contain mercury, asbestos, lead, and polychlorinated biphenyls have special provisions that must be addressed. The utilization of commercially available software programs for addressing hazardous material issues during D and D operations, such as legacy chemicals and documentation, is presented. These user-friendly programs eliminate part of the tediousness associated with the complex requirements of legacy hazardous materials. A key element of this approach is having a program that inventories and tracks all hazardous materials. Without an inventory of chemicals stored in a particular location, many important questions pertinent to D and D operations can be difficult to answer. On the other hand, a well-managed inventory system can address unstable and highly toxic chemicals and hazardous material records concerns before they become an issue. Tapping into the institutional database provides a way to take advantage of the combined expertise of the institution in managing a cost-effective D and D program, as well as adding a quality assurance element to the program. Using laboratory requirements as a logic flow

  14. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for the detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese Coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them giving the end user control over the model simulations. Parameters such as the date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be available to the simulators. The simulation tools developed can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus, can be very
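
    The data-gathering step described above, pulling forecast fields from remote THREDDS servers via OPeNDAP, looks roughly like the sketch below with xarray. The server URL and variable names are placeholders, not the actual ARCOPOL/EASYCO endpoints.

```python
# Fetch a forecast field over OPeNDAP and sample it at a spill location.
# URL and variable names are hypothetical.

import xarray as xr

url = "https://thredds.example.org/thredds/dodsC/forecast/ocean.nc"
ds = xr.open_dataset(url)                      # lazy remote open via OPeNDAP

u = ds["water_u"].interp(lon=-9.2, lat=38.7)   # surface current at the site
print(u.isel(time=0).values)
```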

  15. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    NASA Astrophysics Data System (ADS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  16. Regional Economic Accounting (REAcct). A software tool for rapidly approximating economic impacts

    SciTech Connect

    Ehlen, Mark Andrew; Vargas, Vanessa N.; Loose, Verne William; Starks, Shirley J.; Ellebracht, Lory A.

    2011-07-01

    This paper describes the Regional Economic Accounting (REAcct) analysis tool that has been in use for the last 5 years to rapidly estimate approximate economic impacts for disruptions due to natural or manmade events. It is based on and derived from the well-known and extensively documented input-output modeling technique initially presented by Leontief and more recently further developed by numerous contributors. REAcct provides county-level economic impact estimates in terms of gross domestic product (GDP) and employment for any area in the United States. The process for using REAcct incorporates geospatial computational tools and site-specific economic data, permitting the identification of geographic impact zones that allow differential magnitude and duration estimates to be specified for regions affected by a simulated or actual event. Using these data as input to REAcct, the number of employees for 39 directly affected economic sectors (including 37 industry production sectors and 2 government sectors) are calculated and aggregated to provide direct impact estimates. Indirect estimates are then calculated using Regional Input-Output Modeling System (RIMS II) multipliers. The interdependent relationships between critical infrastructures, industries, and markets are captured by the relationships embedded in the input-output modeling structure.
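
    The core input-output arithmetic behind such tools is compact: total impacts are the Leontief inverse applied to the direct shock. A toy three-sector example with invented coefficients:

```python
# Leontief input-output impact calculation: total = (I - A)^-1 · direct.

import numpy as np

A = np.array([[0.10, 0.05, 0.02],      # technical coefficients: inter-industry
              [0.20, 0.10, 0.10],      # purchases per dollar of output
              [0.05, 0.15, 0.05]])
direct = np.array([10.0, 0.0, 2.5])    # direct output loss by sector ($M)

total = np.linalg.solve(np.eye(3) - A, direct)
print("total impact ($M):", total.round(2))
print("implied multiplier:", round(total.sum() / direct.sum(), 2))
```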

  17. CAGO: A Software Tool for Dynamic Visual Comparison and Correlation Measurement of Genome Organization

    PubMed Central

    Chang, Yi-Feng; Chang, Chuan-Hsiung

    2011-01-01

    CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using a SVG viewer through JavaScript. Signal analysis functions are implemented using R statistical software and a discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago. PMID:22114666

  18. Exon array data analysis using Affymetrix power tools and R statistical software

    PubMed Central

    2011-01-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform. PMID:21498550

  19. Software tools and preliminary design of a control system for the 40m OAN radiotelescope

    NASA Astrophysics Data System (ADS)

    de Vicente, P.; Bolaño, R.

    2004-07-01

    The Observatorio Astronómico Nacional (OAN) is building a 40m radiotelescope in its facilities in Yebes (Spain) which will be delivered by April 2004. The servosystem will be controlled by an ACU (Antenna Control Unit), a real time computer running VxWorks which will be commanded from a remote computer (RCC) or from a local computer (LCC) which will act as console. We present the tools we have chosen to develop and use the control system for the RCC and the criteria followed for the choices we made. We also present a preliminary design of the control system on which we are currently working. The RCC will run a server which communicates with the ACU using sockets and with the clients, receivers and backends using OmniOrb, a free implementation of CORBA. Clients running Python will allow the users to control the antenna from any host connected to a LAN or a secure Internet connection.
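
    As a minimal sketch of the remote-command path, the client below sends a plain-text command to the RCC server over a socket. The host, port, and command syntax are invented for illustration; the real system also uses omniORB/CORBA for receivers and backends.

```python
# Hypothetical socket client for sending commands to the RCC server.

import socket

def send_command(cmd, host="rcc.example.org", port=5000):
    """Send one newline-terminated command and return the server's reply."""
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall((cmd + "\n").encode())
        return s.recv(4096).decode().strip()

# Example (command syntax invented):
# print(send_command("ANTENNA GOTO AZ=120.5 EL=45.0"))
```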

  1. ProViDE: A software tool for accurate estimation of viral diversity in metagenomic samples

    PubMed Central

    Ghosh, Tarini Shankar; Mohammed, Monzoorul Haque; Komanduri, Dinakar; Mande, Sharmila Shekhar

    2011-01-01

    Given the absence of universal marker genes in the viral kingdom, researchers typically use BLAST (with stringent E-values) for taxonomic classification of viral metagenomic sequences. Since the majority of metagenomic sequences originate from hitherto unknown viral groups, using stringent E-values results in most sequences remaining unclassified. Furthermore, using less stringent E-values results in a high number of incorrect taxonomic assignments. The SOrt-ITEMS algorithm provides an approach to address the above issues. Based on alignment parameters, SOrt-ITEMS follows an elaborate work-flow for assigning reads originating from hitherto unknown archaeal/bacterial genomes. In SOrt-ITEMS, alignment parameter thresholds were generated by observing patterns of sequence divergence within and across various taxonomic groups belonging to the bacterial and archaeal kingdoms. However, many taxonomic groups within the viral kingdom lack a typical Linnean-like taxonomic hierarchy. In this paper, we present ProViDE (Program for Viral Diversity Estimation), an algorithm that uses a customized set of alignment parameter thresholds, specifically suited for viral metagenomic sequences. These thresholds capture the pattern of sequence divergence and the non-uniform taxonomic hierarchy observed within and across various taxonomic groups of the viral kingdom. Validation results indicate that the percentage of ‘correct’ assignments by ProViDE is around 1.7 to 3 times higher than that of the widely used similarity-based method MEGAN. The misclassification rate of ProViDE is around 3 to 19% (as compared to 5 to 42% by MEGAN), indicating significantly better assignment accuracy. The ProViDE software and a supplementary file (containing the supplementary figures and tables referred to in this article) are available for download from http://metagenomics.atc.tcs.com/binning/ProViDE/ PMID:21544173

  2. Pol(F)lux software, a dedicated tool to stream nutrient fluxes and uncertainties calculations for survey optimization

    NASA Astrophysics Data System (ADS)

    Moatar, F.; Curie, F.; Meybeck, M.

    2015-12-01

    Data on stream material fluxes are essential for calculating element cycles (carbon, nutrients, and pollutants) and erosion rates from local to global scales. In most water-quality stations throughout the world, stream fluxes are calculated from daily flow data (Q) and discrete concentration data (C), the latter often being the main cause of large uncertainties. This paper presents the Pol(F)lux software tool, which addresses two major issues: i) the selection of the optimal (minimal-uncertainty) flux calculation method among 8 methods, based on the flux variability matrix; ii) for the discharge-weighted concentration method (the most commonly used method, and the one recommended in the international convention for the protection of the North Sea and the Northeast Atlantic, the OSPAR Convention), the sampling frequency needed to achieve a specified level of precision can be predicted from the flux variability indicator (M2%, the cumulative material flux discharged during the upper 2% of highest daily fluxes) through a nomograph for sampling intervals of 3 to 60 days. The software was validated for water-quality stations in medium to large basins (basin area > 500 km²). The flux variability matrix, the cornerstone of the Pol(F)lux software, is based on two indicators: (a) the cumulative flow volume discharged during the upper 2% of highest daily flows, W2%, which characterizes the hydrological reactivity of the catchment during the highest flows, and (b) the truncated b50sup exponent, calculated as the exponent of the relationship between concentration and discharge (on logarithmic scales) at high-water stages (discharges greater than the median flow), which characterizes the behaviour of the stream material. We postulate that performance is similar for stream materials in the same flux variability class; the matrix is composed of 4 classes of hydrological reactivity (W2%) and 5 classes of biogeochemical behaviour (b50sup), defining 20 potential variability classes.
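
    The discharge-weighted concentration method referenced above is simple to state: the annual flux estimate is the discharge-weighted mean concentration multiplied by the mean discharge. A minimal sketch with synthetic data, with unit conversions as commented:

```python
# Discharge-weighted concentration (DWC) flux estimate from sparse
# concentration samples and continuous daily discharge. Synthetic data.

import numpy as np

def dwc_flux(c_samples, q_at_samples, q_daily):
    """C in mg/L (= g/m3), Q in m3/s; returns tonnes per year."""
    c_weighted = np.sum(c_samples * q_at_samples) / np.sum(q_at_samples)
    seconds_per_year = 365.25 * 86400
    return c_weighted * np.mean(q_daily) * seconds_per_year * 1e-6

rng = np.random.default_rng(2)
q_daily = rng.lognormal(3.0, 0.6, 365)     # synthetic daily discharge record
idx = np.arange(0, 365, 30)                # roughly monthly sampling dates
c = 2.0 + 0.05 * q_daily[idx]              # synthetic C-Q relationship
print(round(dwc_flux(c, q_daily[idx], q_daily), 1), "t/yr")
```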

  3. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, and can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system used to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025

  4. JCoast – A biologist-centric software tool for data mining and comparison of prokaryotic (meta)genomes

    PubMed Central

    Richter, Michael; Lombardot, Thierry; Kostadinov, Ivaylo; Kottmann, Renzo; Duhaime, Melissa Beth; Peplies, Jörg; Glöckner, Frank Oliver

    2008-01-01

    Background Current sequencing technologies give access to sequence information for genomes and metagenomes at tremendous speed. Subsequent data processing is mainly performed by automatic pipelines provided by the sequencing centers. Although standardised workflows are desirable and useful in many respects, rational data mining, comparative genomics, and especially the interpretation of sequence information in its biological context demand intuitive, flexible, and extendable solutions. Results The JCoast software tool was primarily designed to analyse and compare (meta)genome sequences of prokaryotes. Based on a pre-computed GenDB database project, JCoast offers a flexible graphical user interface (GUI), as well as an application programming interface (API) that facilitates back-end data access. JCoast offers individual, cross-genome, and metagenome analysis, and assists the biologist in the exploration of large and complex datasets. Conclusion JCoast combines all functions required for the mining, annotation, and interpretation of (meta)genomic data. The lightweight software solution allows the user to easily take advantage of advanced back-end database structures by providing a programming and graphical user interface to answer biological questions. JCoast is available at the project homepage. PMID:18380896

  5. cdfs-sim, cdfs-extract, LFtools: new software tools for XMM-Newton and other missions

    NASA Astrophysics Data System (ADS)

    Ranalli, P.

    2014-07-01

    With the increasing size and complexity of data in modern astrophysics, software is playing a major role among the astronomer's tools. The public availability of code is key to allow a faster advancement of science, and to guarantee the reproducibility of published results. In this poster I will present a collection of programs which I have been developing as part of my research, which are being successfully used by different groups for their publications (XMM-CDFS, Stripe-82, XXL), and which I have publicly released as free software. While currently tuned to XMM-Newton, all of them are extensible to other missions. The list includes: cdfs-sim: a simulator of X-ray astronomical observations. It can simulate an arbitrary set of point sources and reproduce the XMM-Newton background, giving an event file which can be analyzed with SAS. cdfs-extract: a program to extract spectra for multiple sources in multiple XMM-Newton observations. LFtools: a set of programs to compute luminosity functions, with binned estimates, maximum likelihood fits and Bayesian parameter exploration.

  6. GPU-FS-kNN: A Software Tool for Fast and Scalable kNN Computation Using GPUs

    PubMed Central

    Arefin, Ahmed Shamsul; Riveros, Carlos; Berretta, Regina; Moscato, Pablo

    2012-01-01

    Background The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volume of biological data demands extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive alternative, General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limited internal memory of the device poses a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. Results We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although the method is simple and straightforward, the performance of kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50–60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. Conclusion Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/. PMID:22937144
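
    The scalability trick described above, partitioning the distance computation so the full distance matrix never has to reside in limited device memory, can be illustrated on the CPU. The numpy sketch below only illustrates the chunking idea; it is not the CUDA implementation.

```python
# Illustrative sketch (CPU/numpy, not the CUDA code) of chunked kNN:
# distances are computed and reduced one partition at a time, so memory
# use is bounded by chunk * n instead of n * n.
import numpy as np

def knn_chunked(data, k, chunk=256):
    """Return indices of the k nearest neighbours of every point."""
    n = data.shape[0]
    nn = np.empty((n, k), dtype=int)
    sq = (data ** 2).sum(axis=1)
    for start in range(0, n, chunk):           # one partition at a time
        block = data[start:start + chunk]
        # squared Euclidean distances of this block to all points
        d = sq[start:start + chunk, None] + sq[None, :] - 2.0 * block @ data.T
        np.fill_diagonal(d[:, start:start + chunk], np.inf)  # exclude self
        nn[start:start + chunk] = np.argsort(d, axis=1)[:, :k]
    return nn

X = np.random.default_rng(1).normal(size=(1000, 20))
print(knn_chunked(X, k=5).shape)               # (1000, 5)
```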

  7. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis

    PubMed Central

    Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods based on mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range. PMID:26665161
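
    As an illustration of length-normalized spectral counting, the sketch below implements the standard NSAF formulation (spectral count divided by protein length, normalized across the sample). This is in the spirit of the approach described above but is not necessarily freeQuant's exact algorithm, and the proteins and counts are invented.

```python
# Standard NSAF (normalized spectral abundance factor) sketch: spectral
# counts are scaled by protein length, then normalized over the sample.
spectral_counts = {"ATP5A1": 120, "NDUFS1": 45, "CYCS": 30}   # MS/MS counts
lengths         = {"ATP5A1": 553, "NDUFS1": 727, "CYCS": 105} # residues

saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}

for protein, abundance in sorted(nsaf.items(), key=lambda kv: -kv[1]):
    print(f"{protein:8s} NSAF = {abundance:.3f}")
```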

  8. The DSET Tool Library: A software approach to enable data exchange between climate system models

    SciTech Connect

    McCormick, J.

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result, standalone programs were created that represent components of the earth's climate (e.g., an atmospheric circulation model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components to form a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled earth simulation is a standalone program designed independently with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
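
    The core mapping service such a tool must provide can be sketched as a regridding routine. The following bilinear-interpolation example only illustrates the idea in numpy; it is not the DSET library's object-oriented implementation, and the grids and field are invented.

```python
# Sketch of mapping a field from one regular grid to a point on another
# grid via bilinear interpolation (illustrative, not DSET code).
import numpy as np

def bilinear(field, src_lat, src_lon, lat, lon):
    """Interpolate `field` (defined on src_lat x src_lon) at (lat, lon)."""
    i = np.searchsorted(src_lat, lat) - 1      # enclosing cell indices
    j = np.searchsorted(src_lon, lon) - 1
    t = (lat - src_lat[i]) / (src_lat[i + 1] - src_lat[i])
    u = (lon - src_lon[j]) / (src_lon[j + 1] - src_lon[j])
    return ((1 - t) * (1 - u) * field[i, j]     + t * (1 - u) * field[i + 1, j]
            + (1 - t) * u     * field[i, j + 1] + t * u       * field[i + 1, j + 1])

src_lat = np.linspace(-90, 90, 91)             # 2-degree "atmosphere" grid
src_lon = np.linspace(0, 358, 180)
field = np.cos(np.radians(src_lat))[:, None] * np.ones((91, 180))
print(bilinear(field, src_lat, src_lon, 45.5, 120.7))  # value at an "ocean" point
```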

  9. CoCoTools: open-source software for building connectomes using the CoCoMac anatomical database.

    PubMed

    Blumenfeld, Robert S; Bliss, Daniel P; Perez, Fernando; D'Esposito, Mark

    2014-04-01

    Neuroanatomical tracer studies in the nonhuman primate macaque monkey are a valuable resource for cognitive neuroscience research. These data ground theories of cognitive function in anatomy, and with the emergence of graph theoretical analyses in neuroscience, there is high demand for these data to be consolidated into large-scale connection matrices ("macroconnectomes"). Because manual review of the anatomical literature is time consuming and error prone, computational solutions are needed to accomplish this task. Here we describe the "CoCoTools" open-source Python library, which automates collection and integration of macaque connectivity data for visualization and graph theory analysis. CoCoTools both interfaces with the CoCoMac database, which houses a vast amount of annotated tracer results from 100 years (1905-2005) of neuroanatomical research, and implements coordinate-free registration algorithms, which allow studies that use different parcellations of the brain to be translated into a single graph. We show that using CoCoTools to translate all of the data stored in CoCoMac produces graphs with properties consistent with what is known about global brain organization. Moreover, in addition to describing CoCoTools' processing pipeline, we provide worked examples, tutorials, links to on-line documentation, and detailed appendices to aid scientists interested in using CoCoTools to gather and analyze CoCoMac data. PMID:24116839

  10. Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications

    PubMed Central

    D’Souza, Malcolm J.; Kashmar, Richard J.; Hurst, Kent; Fiedler, Frank; Gross, Catherine E.; Deol, Jasbir K.; Wilson, Alora

    2015-01-01

    Wesley College is a private, primarily undergraduate minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential learning courses using instrumentation, data-collection, data-storage, statistical-modeling analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics major core-requirements, a geographic information systems (GIS) course, and a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course; students can then add major electives that bring further depth and value to their future post-graduate specialty areas. Open-sourced georeferenced census, health, and health-disparity data were coupled with GIS and SAS tools in a public health surveillance system project, based on US county zip-codes, to develop use-cases for chronic adult obesity where income, poverty status, health insurance coverage, education, and age were categorical variables. Across the 48 contiguous states, obesity rates were found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk-factors in its adult population. Furthermore, the 2004–2010 obesity trends showed that in two of the less densely populated Delaware counties, Sussex and Kent, adult obesity was progressing at much higher proportions than the national average. PMID:26191337

  11. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  12. An automatic approach for calibrating dielectric bone properties by combining finite-element and optimization software tools.

    PubMed

    Su, Yukun; Kluess, Daniel; Mittelmeier, Wolfram; van Rienen, Ursula; Bader, Rainer

    2016-09-01

    The dielectric properties of human bone are one of the most essential inputs required for electromagnetic stimulation for improved bone regeneration. Measuring the electric properties of bone is a difficult task because of the complexity of the bone structure. Therefore, an automatic approach is presented to calibrate the electric properties of bone. The numerical method consists of three steps: generating input from experimental data, performing the numerical simulation, and calibrating the bone dielectric properties. As an example, the dielectric properties of a rabbit distal femur at 20 Hz were calibrated. The calibration was treated as an optimization process with the aim of finding the dielectric bone properties that best match the numerically calculated and experimentally measured data sets. The optimization was carried out automatically by the optimization software tool iSIGHT in combination with the finite-element solver COMSOL Multiphysics. As a result, the optimum conductivity and relative permittivity of the rabbit distal femur at 20 Hz were found to be 0.09615 S/m and 19522 for cortical bone and 0.14913 S/m and 1561507 for cancellous bone, respectively. The proposed method is a potential tool for the identification of realistic dielectric properties of the entire bone volume. The presented approach combining iSIGHT with COMSOL is applicable, amongst other uses, to designing implantable electro-stimulative devices or optimizing electrical stimulation parameters for improved bone regeneration. PMID:26777343
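
    The calibration loop can be sketched generically: an optimizer repeatedly runs a forward model and minimizes the misfit to measured data. In the sketch below, a cheap analytic function stands in for the COMSOL finite-element solver, and the measurements, electrode positions, and parameter bounds are all hypothetical.

```python
# Generic calibration-by-optimization sketch (a stand-in for the
# iSIGHT + COMSOL workflow; model and data are invented).
import numpy as np
from scipy.optimize import minimize

measured = np.array([0.37, 0.27, 0.20, 0.15])     # hypothetical potentials (V)
positions = np.array([1.0, 2.0, 3.0, 4.0])        # electrode positions

def forward_model(params):
    """Cheap analytic stand-in for the FE simulation: predicted potential
    given conductivity sigma (S/m) and log10 of relative permittivity."""
    sigma, log_eps = params
    return np.exp(-sigma * positions) / log_eps

def misfit(params):
    return np.sum((forward_model(params) - measured) ** 2)

# bounded optimization plays the role of the optimizer driving the solver
result = minimize(misfit, x0=[0.1, 4.0], method="L-BFGS-B",
                  bounds=[(1e-3, 1.0), (1.0, 8.0)])
sigma, log_eps = result.x
print(f"calibrated sigma = {sigma:.3f} S/m, eps_r = {10 ** log_eps:.0f}")
```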

  13. Use of proteolytic enzymes as an additional tool for trypanosomatid identification.

    PubMed

    Santos, A L S; Abreu, C M; Alviano, C S; Soares, R M A

    2005-01-01

    The expression of proteolytic activities in the Trypanosomatidae family was explored as a potential marker to discriminate between the morphologically indistinguishable flagellates isolated from insects and plants. We have comparatively analysed the proteolytic profiles of 19 monoxenous trypanosomatids (Herpetomonas anglusteri, H. samuelpessoai, H. mariadeanei, H. roitmani, H. muscarum ingenoplastis, H. muscarum muscarum, H. megaseliae, H. dendoderi, Herpetomonas sp., Crithidia oncopelti, C. deanei, C. acanthocephali, C. harmosa, C. fasciculata, C. guilhermei, C. luciliae, Blastocrithidia culicis, Leptomonas samueli and Lept. seymouri) and 4 heteroxenous flagellates (Phytomonas serpens, P. mcgheei, Trypanosoma cruzi and Leishmania amazonensis) by in situ detection of enzyme activities on sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) gels containing co-polymerized gelatine as substrate, in association with specific proteinase inhibitors. All 23 trypanosomatids expressed at least 1 acidic proteolytic enzyme. In addition, a characteristic and specific pattern of cell-associated metallo- and/or cysteine proteinases was observed, except for the similar profiles detected in 2 Herpetomonas (H. anglusteri and H. samuelpessoai) and 3 Crithidia (C. fasciculata, C. guilhermei and C. luciliae) species. However, these flagellates released distinct secretory proteinase profiles into the extracellular medium. These findings strongly suggest that the association of cellular and secretory proteinase patterns could represent a useful marker to help trypanosomatid identification. PMID:15700759

  14. FusionFinder: A Software Tool to Identify Expressed Gene Fusion Candidates from RNA-Seq Data

    PubMed Central

    Francis, Richard W.; Thompson-Wicking, Katherine; Carter, Kim W.; Anderson, Denise; Kees, Ursula R.; Beesley, Alex H.

    2012-01-01

    The hallmarks of many haematological malignancies and solid tumours are chromosomal translocations, which may lead to gene fusions. Recently, next-generation sequencing techniques at the transcriptome level (RNA-Seq) have been used to verify known and discover novel transcribed gene fusions. We present FusionFinder, a Perl-based software tool designed to automate the discovery of candidate gene fusion partners from single-end (SE) or paired-end (PE) RNA-Seq read data. FusionFinder was applied to data from a previously published analysis of the K562 chronic myeloid leukaemia (CML) cell line. Using FusionFinder we successfully replicated the findings of this study and detected additional previously unreported fusion genes in the dataset, which were confirmed experimentally. These included two isoforms of a fusion involving the genes BRK1 and VHL, whose co-deletion has previously been associated with the prevalence and severity of renal-cell carcinoma. FusionFinder is made freely available for non-commercial use and can be downloaded from the project website (http://bioinformatics.childhealthresearch.org.au/software/fusionfinder/). PMID:22761941

  15. Gmat. A software tool for the computation of the rovibrational G matrix

    NASA Astrophysics Data System (ADS)

    Castro, M. E.; Niño, A.; Muñoz-Caro, C.

    2009-07-01

    In addition, the program should handle the large number of files generated in massive explorations of molecular potential energy hypersurfaces. In these cases, Gmat will provide the G matrix as a function of the molecular structure. Solution method: To reach its objectives, Gmat has been organized in two components: an interface and a functional part. This organization separates the input/output tasks, which depend on the human-machine interaction model selected, from the functional requirements, which do not. An object-oriented approach has been used in both parts. In the interface, polymorphism is used to allow data acquisition from the output files of different electronic structure codes. In the functional part, Gmat numerically computes the derivatives of the Cartesian coordinates with respect to the vibrational coordinates needed to build the G matrix. Extremely accurate numerical derivatives are obtained in a double procedure. First, the truncation plus roundoff errors are minimized in the central-differences expression. Second, the result is embedded in a nine-level Richardson extrapolation process. In the present version, the program allows the use of internal coordinates as vibrational coordinates, with the principal axes of inertia as the body-fixed system. Running time: Sample test runs provided with the distribution take a few seconds to execute.
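
    The double procedure lends itself to a compact illustration. The sketch below combines central differences with Richardson extrapolation (four levels here rather than Gmat's nine) for a scalar function; it illustrates the numerical scheme, not Gmat's code.

```python
# Central differences refined by Richardson extrapolation: each column of
# the triangle cancels the next even power of h in the error expansion.
import numpy as np

def derivative(f, x, h=1e-2, levels=4):
    """Central-difference derivative improved by Richardson extrapolation."""
    d = np.empty((levels, levels))
    for i in range(levels):
        step = h / 2.0 ** i
        d[i, 0] = (f(x + step) - f(x - step)) / (2.0 * step)
    for j in range(1, levels):                 # Richardson triangle
        for i in range(j, levels):
            d[i, j] = d[i, j - 1] + (d[i, j - 1] - d[i - 1, j - 1]) / (4.0 ** j - 1)
    return d[levels - 1, levels - 1]

print(derivative(np.sin, 1.0))                 # ~cos(1.0) = 0.5403023...
print(abs(derivative(np.sin, 1.0) - np.cos(1.0)))  # error typically ~1e-12 or less
```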

  16. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    NASA Astrophysics Data System (ADS)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    input structure (e.g., a checkerboard pattern) will be resolved by data for different types of earthquake-receiver geometries. Additionally, Larry3D, a three-dimensional seismic tomography tool contributed by Boschi, and NonLinLoc, a nonlinear earthquake relocation tool by Anthony Lomax, are both under development. The goal of all of the implemented modules is to aid in teaching research techniques, while remaining flexible enough for use in true research applications. In the long run, SEATREE may contribute to new ways of sharing scientific research, making published (numerical) experiments truly reproducible again. SEATREE can be downloaded as a package from http://geosys.usc.edu/projects/seatree/wiki/, and users can also subscribe to our Subversion project page. The software is designed to run on GNU/Linux based platforms and has also been successfully run on Mac OS-X. Our poster will present the four currently implemented modules, along with our design philosophies and implementation details.

  17. SlopMap: a software application tool for quick and flexible identification of similar sequences using exact k-mer matching.

    PubMed

    Zhbannikov, Ilya Y; Hunter, Samuel S; Settles, Matthew L; Foster, James A

    2013-08-01

    With the advent of Next-Generation (NG) sequencing, it has become possible to sequence entire genomes quickly and inexpensively. However, in some experiments one only needs to extract and assemble a portion of the sequence reads, for example when performing transcriptome studies, sequencing mitochondrial genomes, or characterizing exomes. With the raw DNA library of a complete genome, identifying the reads of interest would appear to be a trivial problem. But it is not always easy to incorporate well-known tools such as BLAST, BLAT, Bowtie, and SOAP directly into bioinformatics pipelines before the assembly stage, either due to incompatibility with the assembler's file inputs, or because it is desirable to incorporate information that must be extracted separately. For example, in order to incorporate flowgrams from a Roche 454 sequencer into the Newbler assembler it is necessary to first extract them from the original SFF files. We present SlopMap, a bioinformatics software utility that allows quick identification of reads similar to a provided reference from either Roche 454 or Illumina DNA libraries. With a simple and intuitive command-line interface and output file formats compatible with assembly programs, SlopMap can be directly embedded into a biological data processing pipeline without any additional programming work. In addition, SlopMap preserves the flowgram information needed by the Roche 454 assembler. PMID:24404406
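
    The underlying exact k-mer matching idea can be sketched in a few lines: index the reference's k-mers in a hash set and keep any read that shares enough k-mers with it. This only illustrates the principle, not SlopMap's implementation; k and the hit threshold below are arbitrary.

```python
# Exact k-mer matching sketch: reads are kept if they share at least
# `min_hits` k-mers with the reference sequence.
def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def filter_reads(reference, reads, k=15, min_hits=3):
    index = kmers(reference, k)                # k-mer index of the reference
    return [r for r in reads if len(kmers(r, k) & index) >= min_hits]

reference = "ATGGCGTACGTTAGCCGGATTACGATCGGATCCGTAGCTAGGATCGATCG"
reads = [
    "GCGTACGTTAGCCGGATTACGATCGGATCC",          # overlaps the reference
    "TTTTTTTTTTTTTTTTTTTTTTTTTTTTTT",          # unrelated
]
print(filter_reads(reference, reads))          # keeps only the first read
```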

  19. Developing a Generic Risk Assessment Simulation Modelling Software Tool for Assessing the Risk of Foot and Mouth Virus Introduction.

    PubMed

    Tameru, B; Gebremadhin, B; Habtemariam, T; Nganwa, D; Ayanwale, O; Wilson, S; Robnett, V; Wilson, W

    2008-06-01

    Foot and Mouth disease (FMD) is a highly contagious viral disease that affects all cloven-hoofed animals. Because of its devastating effects on the agricultural industry, many countries take measures to stop the introduction of FMD virus into their territories. Decision makers at multiple levels of the United States Department of Agriculture (USDA) use Risk Assessments (RAs), both quantitative and qualitative, to make better and more informed, scientifically based decisions to prevent the accidental or intentional introduction of the disease. There is a need for a generic RA that can be applied to any country (whether FMD free or not) and to any product (FMD-infected animals and animal products). We developed a user-friendly generic RA tool (software) that can be used to conduct and examine different scenarios of quantitative/qualitative risk assessment for countries with varying FMD statuses in relation to the reintroduction of FMD virus into the USA. The program was written in Microsoft Visual Basic 6.0 (Microsoft Corporation, Redmond, Washington, USA). The @Risk 6.1 Developer Kit (RDK) and @Risk 6.1 Best Fit Kit library (Palisade Corporation, Newfield, NY, USA) were used to build Monte Carlo simulation models. Microsoft Access 2000 (Microsoft Corporation, Redmond, Washington, USA) and SQL were used to query the data. Different input probability distributions can be selected for the nodes in the scenario tree, the output for each end-state of the simulation is given in different graphical formats, and statistical values are used to describe the likelihood of FMD virus introduction. A sensitivity analysis determining which input factors have the greatest effect on the total risk outputs is also provided. The developed generic RA tool can eventually be extended and modified to conduct RAs for other animal diseases and animal products. PMID:25411550
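
    The scenario-tree Monte Carlo idea can be sketched without the @Risk toolkit: each node's probability is drawn from a distribution expressing input uncertainty, the draws are multiplied along the pathway, and the output distribution is summarized. The node names, Beta parameters, and sensitivity measure below are illustrative assumptions, not the model's actual inputs.

```python
# Scenario-tree Monte Carlo sketch (numpy instead of the @Risk Developer
# Kit); all node probabilities and distributions are invented.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2024)
N = 100_000                                    # Monte Carlo iterations

# each scenario-tree node draws its probability from a Beta distribution,
# reflecting uncertainty in the input estimate
p_infected_shipment = rng.beta(2, 200, N)      # shipment carries FMD virus
p_evades_inspection = rng.beta(5, 50, N)       # passes border inspection
p_reaches_livestock = rng.beta(3, 100, N)      # contacts susceptible animals

risk = p_infected_shipment * p_evades_inspection * p_reaches_livestock
print(f"mean introduction risk: {risk.mean():.2e}")
print(f"5th-95th percentile: {np.percentile(risk, 5):.2e} "
      f"to {np.percentile(risk, 95):.2e}")

# crude sensitivity analysis: rank-correlate each input with the output
for name, p in [("infected shipment", p_infected_shipment),
                ("evades inspection", p_evades_inspection),
                ("reaches livestock", p_reaches_livestock)]:
    print(f"{name:18s} rho = {spearmanr(p, risk).correlation:+.2f}")
```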

  20. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    NASA Astrophysics Data System (ADS)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray-tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan-beam computed tomography (CT), and cone-beam CT (CBCT) imaging. The user can modify the photon source, the object to be imaged, and the imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is not yet implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% difference in attenuation coefficient. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
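
    The scatter-free ray-tracing model reduces to Beer-Lambert attenuation along each ray: the primary fluence is the source fluence times exp(-sum of mu_i * d_i) over the materials crossed. A minimal sketch, with illustrative round-number attenuation coefficients rather than ImaSim's data tables:

```python
# Beer-Lambert transmission along one ray through piecewise-uniform
# materials (coefficients are illustrative, not ImaSim's tables).
import numpy as np

mu = {"water": 0.02, "bone": 0.05, "air": 0.0}   # linear atten. (1/mm), ~60 keV

def transmitted(i0, path):
    """path = list of (material, thickness_mm) segments along one ray."""
    total = sum(mu[m] * t for m, t in path)
    return i0 * np.exp(-total)

ray = [("air", 200.0), ("water", 150.0), ("bone", 20.0), ("water", 30.0)]
print(f"primary transmission: {transmitted(1.0, ray):.4f}")  # fraction of I0
```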

  1. Evaluation of three methods for retrospective correction of vignetting on medical microscopy images utilizing two open source software tools.

    PubMed

    Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina

    2011-12-01

    Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used, and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times, and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting; if it is not applicable, then morphological filtering may be suggested as the retrospective alternative. PMID:21950542
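
    The evaluation metric used in the study, peak signal-to-noise ratio, is straightforward to compute. A minimal sketch, with random arrays standing in for the microscopy images:

```python
# PSNR between a corrected image and the reference: 10*log10(peak^2 / MSE).
import numpy as np

def psnr(reference, corrected, peak=255.0):
    mse = np.mean((reference.astype(float) - corrected.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(7)
original = rng.integers(0, 256, size=(512, 512))
restored = np.clip(original + rng.normal(0, 2, original.shape), 0, 255)
print(f"PSNR = {psnr(original, restored):.1f} dB")   # higher = better correction
```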

  2. Generator program for computer-assisted instruction: MACGEN. A software tool for generating computer-assisted instructional texts.

    PubMed

    Utsch, M J; Ingram, D

    1983-01-01

    This publication describes MACGEN, an interactive development tool that assists teachers in creating, modifying and extending case simulations, tutorial exercises and multiple-choice question tests designed for computer-aided instruction. The menu-driven software provides full authoring facilities for text files in MACAID format by means of interactive editing. Authors are prompted for items which they might want to change, whereas all user-independent items are provided automatically. Optional default values and explanatory messages are available with every prompt. Errors are corrected automatically or commented upon. Thus the program eliminates the need to become familiar with a new language or with the details of the text file structure. The options for modification of existing text files include display, renumbering of frames and a line-oriented editor. The resulting text files can be interpreted by the MACAID driver without further changes. The text file is held as ASCII records and as such is also accessible with many standard word-processing systems if desired. PMID:6362978

  3. Software tools -- Man pages

    SciTech Connect

    1996-02-01

    Name, availability, synopsis, description, example, release and last change date are given for each of the following computer codes: DBLOADTEMPLATE(1); GDCT(1); SF2DB(1); SUBTOOL(1); DBLOADRECORDS(3); epvxiMsgLib(1); epvxiLib(1); freeList(1); gpHash(1); DBDATABASE(1); DBFILE(5); and TEMPLATEFILE(5).

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  5. NBC update: The addition of viral and fungal databases to the Naïve Bayes classification tool

    PubMed Central

    2012-01-01

    Background Classifying the fungal and viral content of a sample is an important component of analyzing microbial communities in environmental media. Therefore, a method to classify any fragment of these organisms' DNA should be implemented. Results We update the naïve Bayes classification (NBC) tool to classify reads originating from viral and fungal organisms. NBC classifies a fungal dataset similarly to the Basic Local Alignment Search Tool (BLAST) and the Ribosomal Database Project (RDP) classifier. We also show NBC's similarities to and differences from RDP on a fungal large subunit (LSU) ribosomal DNA dataset. For viruses in the training database, strain classification accuracy is 98%, while for reads originating from sequences not in the database, the order-level accuracy is 78%, where order indicates the taxonomic level in the tree of life. Conclusions In addition to being competitive with other available classifiers, NBC has the potential to handle reads originating from any location in the genome. We recommend using the Bacteria/Archaea, Fungal, and Virus databases separately due to algorithmic biases towards long genomes. The tool is publicly available at: http://nbc.ece.drexel.edu. PMID:22293603
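
    The classification principle can be sketched compactly: score a read against each training genome by summing log-probabilities of its k-mers, with smoothing for unseen k-mers. The toy "genomes", k = 4, and add-one smoothing below are purely illustrative; the published NBC tool trains on full genomes with much longer k-mers.

```python
# Naive Bayes read classification over k-mer counts (toy illustration).
import math
from collections import Counter

def kmer_counts(seq, k=4):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def log_score(read, genome_counts, k=4, alpha=1.0):
    """Sum of log P(kmer | genome) with add-alpha smoothing."""
    total = sum(genome_counts.values())
    vocab = 4 ** k                             # possible k-mers over {A,C,G,T}
    return sum(math.log((genome_counts[km] + alpha) / (total + alpha * vocab))
               for km in kmer_counts(read, k).elements())

training = {"virusA": kmer_counts("ATGCGT" * 50),
            "fungusB": kmer_counts("GGCCTA" * 50)}
read = "ATGCGTATGCGTATGCGT"
best = max(training, key=lambda g: log_score(read, training[g]))
print(best)                                    # -> virusA
```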

  6. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a midfidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit once the hydrodynamic body-to-body coefficients are introduced, because the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, as well as to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.
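
    The computational advantage of the state-space form is that the radiation-force convolution over the velocity history collapses to a small linear state update per time step. A hedged numerical sketch follows; the 2-state system below is an arbitrary stand-in for a fitted radiation kernel, not WEC-Sim output.

```python
# Fluid-memory force via a state-space model: march x' = A x + B v,
# F = C x, instead of convolving the kernel K(t - tau) with v(tau).
import numpy as np

dt = 0.01
A = np.array([[0.0, 1.0], [-4.0, -0.4]])       # assumed fitted state matrix
B = np.array([0.0, 1.0])
C = np.array([2.0, 0.0])

def memory_force(velocity):
    x = np.zeros(2)
    forces = np.empty(velocity.size)
    for k, v in enumerate(velocity):
        x = x + dt * (A @ x + B * v)           # one O(n_states) update per step
        forces[k] = C @ x
    return forces

t = np.arange(0.0, 20.0, dt)
F = memory_force(np.sin(0.5 * t))              # body velocity time series
print(F[-5:])
```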

  7. Software attribute visualization for high integrity software

    SciTech Connect

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  8. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits. PMID:27053054

  9. Should software hold data hostage?

    SciTech Connect

    Wiley, H S.; Michaels, George S.

    2004-08-01

    Software tools have become an indispensable part of modern biology, but issues surrounding proprietary file formats and closed software architectures threaten to stunt the growth of this rapidly expanding area of research. In an effort to ensure continuous software upgrades that provide a continuous income stream, some software companies have resorted to holding the user's data hostage by locking them into proprietary file and data formats. Although this might make sense from a business perspective, it violates fundamental principles of data ownership and control. Such tactics should not be tolerated by the scientific community. The future of data-intensive biology depends on ensuring open data standards and freely exchangeable file formats. Compared to the engineering and chemistry fields, computers are a relatively recent addition to the arsenal of biological tools. Thus the pool of potential users of biology-oriented software is comparatively small. Biology itself is a broad field with many sub-disciplines, such as neurobiology, biochemistry, genomics and cell biology. This creates the need for task-oriented software tools that necessarily have a small user base. Simultaneously, the task of developing software has become more complex with the need for multi-platform software and increasing user expectations of sophisticated interfaces and a high degree of usability. Writing successful software in such an environment is very challenging, but progress in biology will increasingly depend on the success of companies and individuals in creating powerful new software tools. The trend to open source software could have an enormous impact on biology by providing the large number of specialized analysis tools that are required. Indeed, in the field of bioinformatics, open source software has become pervasive, largely because of the high degree of computer skill necessary for workers in this field. For these tools to be usable by non-specialists, however, requires the

  10. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    PubMed

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacture (CAM) technology were used for full crown fabrication and measurement of crown accuracy, in an attempt to establish an open CAM system for dental processing and to promote the introduction of a domestic dental computer-aided design (CAD)/CAM system. Commercially available scanning equipment was used to make a basic digital tooth model after crown preparation, and the CAD software that comes with the scanning device was employed to design the crown. Domestic industrial CAM software was used to process the crown data in order to generate a solid model for machining purposes, and the China-made 5-axis simultaneous contouring CNC machine tool was then used to complete the machining of the whole crown, whose internal accuracy was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool in combination with domestic industrial CAM technology can be used for crown making, and the crown was well positioned on the die. The internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry on the basis of a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software has been established, and development of the system will promote the introduction of a domestically produced dental CAD/CAM system. PMID:22038364

  11. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater. PMID:23982824


  13. NOTE: A software tool for 2D/3D visualization and analysis of phase-space data generated by Monte Carlo modelling of medical linear accelerators

    NASA Astrophysics Data System (ADS)

    Neicu, Toni; Aljarrah, Khaled M.; Jiang, Steve B.

    2005-10-01

    A computer program has been developed for novel 2D/3D visualization and analysis of the phase-space parameters of Monte Carlo simulations of medical accelerator radiation beams. The software is written in the IDL language and reads the phase-space data generated in the BEAMnrc/BEAM Monte Carlo code format. Contour and colour-wash plots of the fluence, mean energy, energy fluence, mean angle, spectra distribution, energy fluence distribution, angular distribution, and slices and projections of the 3D ZLAST distribution can be calculated and displayed. Based on our experience of using it at Massachusetts General Hospital, the software has proven to be a useful tool for analysis and verification of the Monte Carlo generated phase-space files. The software is in the public domain.

  14. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates have a substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics, which utilizes continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
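
    The coupling can be sketched in miniature: a discrete-event loop processes task-completion events while a continuous, system-dynamics-style equation evolves productivity between events. All rates and constants below are invented for illustration; the paper's models are far more detailed.

```python
# Toy hybrid simulation: discrete task-completion events interleaved with
# a continuous productivity equation integrated between events.
import heapq

TASKS = 50
productivity = 1.0                 # tasks/day (discrete-event service rate)
pressure = 0.4                     # schedule pressure (system-dynamics input)
clock, done = 0.0, 0
events = [(1.0 / productivity, "complete")]    # first task finishes here

while events and done < TASKS:
    t_next, _ = heapq.heappop(events)
    dt = t_next - clock
    # continuous piece: productivity relaxes toward (1 - pressure)
    productivity += dt * 0.2 * ((1.0 - pressure) - productivity)
    clock, done = t_next, done + 1
    if done < TASKS:               # discrete piece: schedule next completion
        heapq.heappush(events, (clock + 1.0 / productivity, "complete"))

print(f"{done} tasks finished at t = {clock:.1f} days; "
      f"productivity drifted to {productivity:.2f} tasks/day")
```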

  15. TOPS: a versatile software tool for statistical analysis and visualization of combinatorial gene-gene and gene-drug interaction screens

    PubMed Central

    2014-01-01

    Background Measuring the impact of combinations of genetic or chemical perturbations on cellular fitness, sometimes referred to as synthetic lethal screening, is a powerful method for obtaining novel insights into gene function and drug action. Especially when performed at large scale, gene-gene or gene-drug interaction screens can reveal complex genetic interactions or drug mechanisms of action, or even identify novel therapeutics for the treatment of diseases. The results of such large-scale screens can be represented as a matrix with a numeric score indicating the cellular fitness (e.g. viability or doubling time) for each double perturbation. In a typical screen, the majority of combinations do not impact the cellular fitness. Thus, it is critical to first discern true "hits" from noise. Subsequent data exploration and visualization methods can assist in extracting meaningful biological information from the data. However, despite the increasing interest in combination perturbation screens, no user-friendly open-source program exists that combines statistical analysis, data exploration tools and visualization. Results We developed TOPS (Tool for Combination Perturbation Screen Analysis), a Java- and R-based software tool with a simple graphical user interface that allows the user to import, analyze, filter and plot data from double perturbation screens as well as other compatible data. TOPS was designed in a modular fashion to allow the user to add alternative importers for data formats or custom analysis scripts not covered by the original release. We demonstrate the utility of TOPS on two datasets derived from functional genetic screens using different methods. Dataset 1 is a gene-drug interaction screen based on Luminex xMAP technology. Dataset 2 is a gene-gene short hairpin (sh)RNAi screen exploring the interactions between deubiquitinating enzymes and a number of prominent oncogenes using massive parallel sequencing (MPS). Conclusions TOPS provides
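
    The first analysis step, discerning true "hits" from noise, can be sketched with a robust z-score over the fitness matrix; TOPS offers several statistical methods, and this is just one standard choice applied to invented data.

```python
# Robust z-score hit calling on a genes x drugs fitness matrix
# (median/MAD resist the mostly-null background).
import numpy as np

rng = np.random.default_rng(3)
fitness = rng.normal(0.0, 1.0, size=(200, 50))   # mostly noise
fitness[10, 4] = -8.0                            # a planted synthetic-lethal pair

med = np.median(fitness)
mad = np.median(np.abs(fitness - med))
z = (fitness - med) / (1.4826 * mad)             # robust z-score

hits = np.argwhere(np.abs(z) > 5.0)
for gene, drug in hits:
    print(f"hit: gene {gene} x drug {drug}, z = {z[gene, drug]:+.1f}")
```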

  16. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all software generation and validation system.

  17. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than older technologies allow. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred of the top MIS executives.

  18. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    SciTech Connect

    BRATZEL, D.R.

    2000-09-28

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks.

  19. The "X-Ray RheumaCoach" software: a novel tool for enhancing the efficacy and accelerating radiological quantification in rheumatoid arthritis

    PubMed Central

    Wick, M; Peloschek, P; Bogl, K; Graninger, W; Smolen, J; Kainberger, F

    2003-01-01

    Objective: To develop computer-assisted quantification software that is particularly applicable to joint scoring in rheumatic disorders. Methods: 3914 radiographs from the hands and feet of 190 patients with RA were collected, expertly examined, analysed, and statistically evaluated. Radiographs were quantified using the conventional Larsen score and the "X-Ray RheumaCoach" (XRRC) software. The XRRC is a Java stand-alone application which can support and accelerate, but not fully automate, the scoring procedure in RA. The scorer can apply both the Larsen and the Ratingen-Rau scores. Results: Compared with conventional scoring procedures, the XRRC software reduced quantification time by ~25%. The program, which is now available on the internet free of charge, ran stably and proved to be a consistently valuable tool. Conclusions: Compared with conventional scoring methods, the XRRC software offers several advantages: (a) structured data analysis and input that minimises variance by standardisation; (b) faster and more precise calculation of sum scores and indices; (c) permanent data storage and fast access to the software's database; (d) the possibility of cross-calculation to other scores; (e) "user friendly" technology and a dedicated help program; (f) fast access and data transfer through the internet if desired; and (g) reliable documentation of results in a specially designed printout. PMID:12759300

  20. The modified ultrasound pattern sum score mUPSS as additional diagnostic tool for genetically distinct hereditary neuropathies.

    PubMed

    Grimm, Alexander; Rasenack, Maria; Athanasopoulou, Ioanna M; Dammeier, Nele Maria; Lipski, Christina; Wolking, Stefan; Vittore, Debora; Décard, Bernhard F; Axer, Hubertus

    2016-02-01

    The objective of this study is to evaluate the nerve ultrasound characteristics of genetically distinct inherited neuropathies, the value of the modified ultrasound pattern sum score (mUPSS) in differentiating between the subtypes, and the correlation of ultrasound with nerve conduction studies (NCS), disease duration and severity. All patients underwent a standardized neurological examination, ultrasound, and NCS. In addition, genetic testing was performed. Subsequently, the mUPSS was applied, which is a sum score of nerve cross-sectional areas (CSA) at predefined anatomical points in different nerves. 31 patients were included (10 x Charcot-Marie-Tooth type 1a (CMT1a), 3 x CMT1b, 3 x CMTX, 9 x CMT2, and 6 x HNPP [hereditary neuropathy with liability to pressure palsies]). Generalized, homogeneous nerve enlargement and significantly increased UPS scores supported the diagnosis of demyelinating neuropathy, particularly CMT1a and CMT1b. The amount of enlargement did not depend on disease duration, symptom severity, height or weight. In CMTX the nerves were enlarged as well, but only in the roots and lower limbs, most prominently in men. In CMT2 no significant enlargement was detectable. In HNPP the CSA values were increased at entrapment sites but not elsewhere. However, a distinction from CMT1, which also showed enlarged CSA values at entrapment sites, was only possible by calculating the entrapment ratios and entrapment score. The mUPSS allowed distinction between CMT1a (increased UPS scores, entrapment ratios <1.0) and HNPP (low UPS scores, entrapment ratios >1.4), while CMT1b and CMTX showed intermediate UPS types and entrapment ratios <1.0. Although based on few cases, ultrasound revealed consistent and homogeneous nerve alterations in certain inherited neuropathies. The modified UPSS is a quantitative tool which may provide useful information for diagnosis, differentiation and follow-up evaluation in addition to NCS and molecular testing. PMID:26559821

  1. Urinary cortisol as an additional tool to assess the welfare of pregnant sows kept in two types of housing.

    PubMed

    Pol, Françoise; Courboulay, Valérie; Cotte, Jean-Pierre; Martrenchar, Arnaud; Hay, Magali; Mormède, Pierre

    2002-01-01

    The use of urinary cortisol (UC) as an additional tool to evaluate sow welfare was assessed in two experiments. In a preliminary methodological experiment, the kinetics of cortisol excretion in urine was studied during an adrenocorticotropic hormone (ACTH) challenge test in 10 pregnant sows. In a second experiment, 96 primiparous sows of an experimental unit were assigned to two different housing systems: 48 animals were housed in individual pens (IP) and 48 animals in collective pens (CP) with 6 animals per pen. UC was measured at the beginning and at the end of pregnancy and compared with other welfare indicators such as behaviour and skin damage. In both experiments, UC was measured using a high-pressure liquid chromatography assay. In experiment 1, UC was constant on the day before the injection of ACTH, with no variations related to circadian rhythm. It began to rise 2 h after the injection, peaked between 2 and 5 h afterwards, then returned to the basal concentration on the day after the injection. In experiment 2, UC concentrations were not different between CP- and IP-housed sows, but they were higher in sows exhibiting the fewest stereotypies in comparison with sows exhibiting the most stereotypies. The results of this study suggest that UC is a good indicator of acute stress, more convenient than plasma cortisol measurement since it is a non-invasive method avoiding restraint or catheterisation of sows. They also suggest that UC could give additional information in the assessment of chronic stress and improve the evaluation of animal welfare if used in conjunction with other welfare indicators. PMID:11873815

  2. A Serious Videogame as an Additional Therapy Tool for Training Emotional Regulation and Impulsivity Control in Severe Gambling Disorder

    PubMed Central

    Tárrega, Salomé; Castro-Carreras, Laia; Fernández-Aranda, Fernando; Granero, Roser; Giner-Bartolomé, Cristina; Aymamí, Neus; Gómez-Peña, Mónica; Santamaría, Juan J.; Forcano, Laura; Steward, Trevor; Menchón, José M.; Jiménez-Murcia, Susana

    2015-01-01

    Background: Gambling disorder (GD) is characterized by a significant lack of self-control and is associated with impulsivity-related personality traits. It is also linked to deficits in emotional regulation and frequently co-occurs with anxiety and depression symptoms. There is also evidence that emotional dysregulation may play a mediatory role between GD and psychopathological symptomatology. Few studies have reported the outcomes of psychological interventions that specifically address these underlying processes. Objectives: To assess the utility of the Playmancer platform, a serious video game, as an additional therapy tool in a CBT intervention for GD, and to estimate pre-post changes in measures of impulsivity, anger expression and psychopathological symptomatology. Method: The sample comprised a single group of 16 male treatment-seeking individuals with severe GD diagnosis. Therapy intervention consisted of 16 group weekly CBT sessions and, concurrently, 10 additional weekly sessions of a serious video game. Pre-post treatment scores on South Oaks Gambling Screen (SOGS), Barratt Impulsiveness Scale (BIS-11), I7 Impulsiveness Questionnaire (I7), State-Trait Anger Expression Inventory 2 (STAXI-2), Symptom Checklist-Revised (SCL-90-R), State-Trait Anxiety Inventory (STAI-S-T), and Novelty Seeking from the Temperament and Character Inventory-Revised (TCI-R) were compared. Results: After the intervention, significant changes were observed in several measures of impulsivity, anger expression and other psychopathological symptoms. Dropout and relapse rates during treatment were similar to those described in the literature for CBT. Conclusion: Complementing CBT interventions for GD with a specific therapy approach like a serious video game might be helpful in addressing certain underlying factors which are usually difficult to change, including impulsivity and anger expression. PMID:26617550

  3. Additive technology of soluble mold tooling for embedded devices in composite structures: A study on manufactured tolerances

    NASA Astrophysics Data System (ADS)

    Roy, Madhuparna

    Composite textiles have found widespread use and offer advantages in various industries and applications. The constant demand for high-quality products and services requires companies to minimize manufacturing costs and delivery time in order to compete in general and niche marketplaces. Advanced manufacturing methods aim to make mold production economical. The creation of molding and tooling options for advanced composites accounts for a large portion of fabrication time, making it a costly process and a restraining factor. This research discusses a preliminary investigation into the use of soluble polymer compounds and additive manufacturing to fabricate soluble molds. These molds suffer from dimensional errors due to several factors, which have also been characterized. The basic soluble mold of a composite is 3D printed to meet the desired dimensions and geometry of holistic structures or spliced components. The time taken to dissolve the mold depends on the rate of agitation of the solvent. This process is steered towards enabling the implantation of optoelectronic devices within the composite to provide sensing capability for structural health monitoring. The shape deviation of the 3D-printed mold is also studied against its original dimensions in order to produce dimensionally accurate parts. Mechanical tests were performed on compact tension (CT) resin samples prepared from these 3D-printed molds and revealed crack propagation towards an embedded intact optical fiber.

  4. Demonstration of the Recent Additions in Modeling Capabilities for the WEC-Sim Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-03-01

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion (WEC) devices. The code uses the MATLAB SimMechanics package to solve the multi-body dynamics and models the wave interactions using hydrodynamic coefficients derived from frequency-domain boundary element methods. In this paper, the new modeling features introduced in the latest release of WEC-Sim are presented. The first feature discussed is the conversion of the fluid memory kernel to a state-space approximation that provides significant gains in computational speed. The benefit of the state-space calculation becomes even greater after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with the number of floating bodies. The final feature discussed is the capability to add Morison elements to provide additional hydrodynamic damping and inertia. This is generally used as a tuning feature, because performance is highly dependent on the chosen coefficients. A review of the hydrodynamic theory for each of the features is provided, and successful implementation is verified using test cases.
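
    As an illustration of the Morison-element idea, here is a minimal Python sketch of the classical Morison force term (inertia plus quadratic drag). This is not WEC-Sim's MATLAB API; the coefficients, dimensions, and wave kinematics below are assumed values chosen only for demonstration.

        import numpy as np

        RHO = 1025.0  # seawater density in kg/m^3 (assumed)

        def morison_force(u, u_dot, Cd, Cm, area, volume, rho=RHO):
            # Inertia term: rho * Cm * V * du/dt
            inertia = rho * Cm * volume * u_dot
            # Quadratic drag term: 0.5 * rho * Cd * A * u * |u|
            drag = 0.5 * rho * Cd * area * u * np.abs(u)
            return inertia + drag

        # Example: sinusoidal wave kinematics at a single point.
        t = np.linspace(0.0, 10.0, 501)
        omega, u_amp = 1.2, 0.8
        u = u_amp * np.sin(omega * t)              # fluid velocity
        u_dot = u_amp * omega * np.cos(omega * t)  # fluid acceleration
        force = morison_force(u, u_dot, Cd=1.0, Cm=2.0, area=0.5, volume=0.2)

    Because the force depends directly on the chosen Cd and Cm, the element behaves as the tuning feature the abstract describes.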

  5. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
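
    As a concrete example of the kind of Python-based tooling alluded to here, pytest collects any function whose name begins with test_ and reports failures; the function under test below is invented purely for illustration.

        # test_stats.py -- run with: pytest test_stats.py
        import math

        def weighted_mean(values, weights):
            # Weighted arithmetic mean of `values`.
            total = sum(w * v for v, w in zip(values, weights))
            return total / sum(weights)

        def test_uniform_weights_match_plain_mean():
            # With equal weights the result must equal the ordinary mean.
            assert weighted_mean([1.0, 2.0, 3.0], [1, 1, 1]) == 2.0

        def test_result_is_finite_for_large_inputs():
            assert math.isfinite(weighted_mean([1e300, 1e300], [0.5, 0.5]))

    Tests like these, run automatically on every change (for example through a continuous-integration service attached to a GitHub repository), are what keeps code producing sensible results as it is updated, modified, and extended.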

  6. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than with older technologies. The latest release of the environment was in February 1992.

  7. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and has its center at the dose point. The distance a ray traverses in each material is converted to an aluminum or other user-selected equivalent thickness, and these equivalent thicknesses are summed along each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first lists contact information for help in installing ProE and Fishbowl, along with notes on platform support and system requirements. Second, it shows the user how to use the software to ray trace a ProE-designed 3D assembly, and it will later serve as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
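
    The per-ray bookkeeping described above is easy to sketch. In the Python fragment below, the geometry intersections that ProE/Fishbowl would actually compute are replaced by hypothetical (length, density) segments, and the ray directions are generated with a Fibonacci spiral, one common way to distribute points roughly uniformly on a sphere; none of the names or values come from the manual itself.

        import numpy as np

        AL_DENSITY = 2.70  # density of aluminum, g/cm^3

        def aluminum_equivalent(path_segments):
            # Convert each material segment to aluminum-equivalent thickness
            # by areal density (t_Al = t * rho / rho_Al) and sum along the ray.
            return sum(length * rho / AL_DENSITY for length, rho in path_segments)

        def fibonacci_sphere(n):
            # n roughly uniform unit vectors: golden-angle spiral on the sphere.
            i = np.arange(n)
            phi = np.pi * (3.0 - np.sqrt(5.0)) * i
            z = 1.0 - 2.0 * (i + 0.5) / n
            r = np.sqrt(1.0 - z * z)
            return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

        directions = fibonacci_sphere(1000)  # ~1000 rays, as in the text

        # Hypothetical segments one ray crosses: 0.3 cm of Al, 1.2 cm of polyethylene.
        one_ray = [(0.3, 2.70), (1.2, 0.94)]
        print(aluminum_equivalent(one_ray))  # aluminum-equivalent thickness in cm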

  8. Design and Multicentric Implementation of a Generic Software Architecture for Patient Recruitment Systems Re-Using Existing HIS Tools and Routine Patient Data

    PubMed Central

    Trinczek, B.; Köpcke, F.; Leusch, T.; Majeed, R.W.; Schreiweis, B.; Wenk, J.; Bergh, B.; Ohmann, C.; Röhrig, R.; Prokosch, H.U.; Dugas, M.

    2014-01-01

    Summary Objective (1) To define features and data items of a Patient Recruitment System (PRS); (2) to design a generic software architecture of such a system covering the requirements; (3) to identify implementation options available within different Hospital Information System (HIS) environments; (4) to implement five PRS following the architecture and utilizing the implementation options as proof of concept. Methods Existing PRS were reviewed and interviews with users and developers were conducted. All reported PRS features were collected and prioritized according to their published success and users' requests. Common feature sets were combined into software modules of a generic software architecture. Data items to process and transfer were identified for each of the modules. Each site collected the implementation options available within its respective HIS environment for each module, provided a prototypical implementation based on the available possibilities, and supported the patient recruitment of a clinical trial as a proof of concept. Results 24 commonly reported and requested features of a PRS were identified, 13 of them prioritized as mandatory. A UML version 2 based software architecture containing 5 software modules covering these features was developed. 13 data item groups processed by the modules, and thus required to be available electronically, were identified. Several implementation options could be identified for each module, most of them available at multiple sites. Utilizing available tools, a PRS could be implemented in each of the five participating German university hospitals. Conclusion A set of required features and data items of a PRS has been described for the first time. The software architecture covers all features in a clear, well-defined way. The variety of implementation options and the prototypes show that it is possible to implement the given architecture in different HIS environments, thus enabling more sites to

  9. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    EPA Science Inventory

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...

  10. Integrating Commercially-Available Educational Software into a Learning Environment with the QuiltSpace Builder Tool.

    ERIC Educational Resources Information Center

    Williamson, Mary

    The QuiltSpace Builder enables Microsoft's multimedia encyclopedia "Encarta" to "fit" into an established home or institutional learning environment so that "Encarta" can be used for productive research. Early observations of the difficulties encountered by Encarta users, coupled with a survey of presently available commercial software products,…

  11. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J. |; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  12. The Perfect Neuroimaging-Genetics-Computation Storm: Collision of Petabytes of Data, Millions of Hardware Devices and Thousands of Software Tools

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.

    2013-01-01

    The volume, diversity and velocity of biomedical data are increasing exponentially, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens of thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276

  13. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    NASA Astrophysics Data System (ADS)

    Grape, Sophie; Jacobsson Svärd, Staffan; Lindberg, Bo

    2013-01-01

    The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has been approved by the IAEA for several years for gross defect verification, i.e. to check whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify whether a fraction of the fuel rods in an assembly has been removed or replaced. The latter endorsement was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3] to obtain the gamma-ray spectrum from the fission products in the fuel, the Monte Carlo toolkit Geant4 [4] to simulate the gamma-ray transport in and around the fuel and the emission of Cherenkov light, and the ray-tracing programme Zemax [5] to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  14. Software tools that facilitate kinetic modelling with large data sets: an example using growth modelling in sugarcane.

    PubMed

    Uys, L; Hofmeyr, J H S; Snoep, J L; Rohwer, J M

    2006-09-01

    A solution to manage the cumbersome data sets associated with large modelling projects is described. A kinetic model of sucrose accumulation in sugarcane is used to predict changes in sucrose metabolism with sugarcane internode maturity, which results in large amounts of output data to be analysed. Growth is simulated by reassigning maximal activity values, specific to each internode of the sugarcane plant, to parameter attributes of a model object. From a programming perspective, only one model definition file is required for the simulation software used; however, the amount of input data increases with each extra internode that is modelled, and the amount of output data generated increases likewise. To store, manipulate and analyse these data, the modelling was performed from within a spreadsheet. This was made possible by the scripting language Python and the modelling software PySCeS, through an embedded Python interpreter available in the Gnumeric spreadsheet program. PMID:16986323
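
    The pattern the authors describe, one model definition with per-internode parameter reassignment, can be sketched roughly as follows. The model name, the parameter attribute, and the Vmax values are hypothetical, and while the calls follow PySCeS's documented interface (pysces.model, doSim, data_sim), this is a sketch to be checked against the installed version, not the paper's actual script.

        import pysces  # the simulation engine named in the abstract

        # Hypothetical maximal-activity values for a few internodes.
        internode_vmax = {3: 1.8, 4: 1.5, 5: 1.1, 6: 0.7}

        mod = pysces.model('sucrose')  # one model definition file, as described

        results = {}
        for internode, vmax in internode_vmax.items():
            mod.Vmax_SuSy = vmax              # hypothetical parameter attribute
            mod.doSim(end=100.0, points=200)  # time-course simulation
            results[internode] = mod.data_sim.getSpecies()

    Looping over parameter sets like this is what generates the per-internode output that the spreadsheet front end is then used to organize.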

  15. PhasePlot: An Interactive Software Tool for Visualizing Phase Relations, Performing Virtual Experiments, and for Teaching Thermodynamic Concepts in Petrology

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2012-12-01

    The computer program PhasePlot was developed for Macintosh computers and released via the Mac App Store in December 2011. It permits the visualization of phase relations calculated from internally consistent thermodynamic data-model collections, including those from MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213). The software allows users to enter a system bulk composition and a range of reference conditions, and then calculate a grid of phase relations. These relations may be visualized in a variety of ways, including pseudosections, phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. The program interface is user friendly and the computations are fast on laptop-scale machines, which makes PhasePlot amenable to in-class demonstrations, as a tool in instructional laboratories, and as an aid in support of out-of-class exercises and research. Users focus on problem specification and interpretation of results rather than on the manipulation and mechanics of computation. The software has been developed with NSF support and is free. The PhasePlot web site is at phaseplot.org, where extensive user documentation, video tutorials and examples of use may be found. The original release of PhasePlot permitted calculations to be performed on pressure-temperature (P-T) grids by direct minimization of the Gibbs free energy of the system at each grid point. A revision of PhasePlot (scheduled for release to the Mac App Store in December 2012) extends capabilities to include pressure-entropy (P-S) grids by system enthalpy minimization, volume-temperature (V-T) grids by system Helmholtz energy minimization, and volume-entropy (V-S) grids by minimization of the internal energy of the system. P-S gridded results may be utilized to visualize phase relations as a function of heat
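
    The grid-to-potential pairings listed above are the standard extremum principles of equilibrium thermodynamics; written compactly in LaTeX, with U the internal energy, S the entropy, and each potential a Legendre transform of U:

        \begin{aligned}
        \text{fixed } (T,P)&: \quad \min\; G = U - TS + PV \\
        \text{fixed } (S,P)&: \quad \min\; H = U + PV \\
        \text{fixed } (T,V)&: \quad \min\; A = U - TS \\
        \text{fixed } (S,V)&: \quad \min\; U
        \end{aligned}

    Each grid type simply fixes a different pair of natural variables, so the potential to be minimized changes accordingly.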

  16. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; Sosonkina, Masha; Windus, Theresa L.

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  17. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  18. Software Reviews.

    ERIC Educational Resources Information Center

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  19. Control Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Real-Time Innovations, Inc. (RTI) collaborated with Ames Research Center, the Jet Propulsion Laboratory and Stanford University to leverage NASA research to produce ControlShell software. RTI is the first "graduate" of Ames Research Center's Technology Commercialization Center. The ControlShell system was used extensively on a cooperative project to enhance the capabilities of a Russian-built Marsokhod rover being evaluated for eventual flight to Mars. RTI's ControlShell is complex, real-time command and control software, capable of processing information and controlling mechanical devices. One ControlShell tool is StethoScope. As a real-time data collection and display tool, StethoScope allows a user to see how a program is running without changing its execution. RTI has successfully applied its software savvy in other arenas, such as telecommunications, networking, video editing, semiconductor manufacturing, automobile systems, and medical imaging.

  20. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated with 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics