Science.gov

Sample records for evaluation software tool

  1. SUSTAINABLE REMEDIATION SOFTWARE TOOL EXERCISE AND EVALUATION

    SciTech Connect

    Kohn, J.; Nichols, R.; Looney, B.

    2011-05-12

    The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

  2. An Evaluation Format for "Open" Software Tools.

    ERIC Educational Resources Information Center

    Murphy, Cheryl A.

    1995-01-01

    Evaluates six "open" (empty of content and customized by users) software programs using the literature-based characteristics of documentation, learner control, branching capabilities, portability, ease of use, and cost-effectiveness. Interviewed computer-knowledgeable individuals to confirm the legitimacy of the evaluative characteristics. (LRW)

  3. fMRI analysis software tools: an evaluation framework

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo

    2011-03-01

    Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data in a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as brain or non-brain, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Using the fuzzy-logic definitions of intersection and union, an index was defined that quantifies the information overlap between Statistical Parametric Maps (SPMs). Direct comparison between fMRI results can only highlight differences; in order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, an Activation Weighted Index (AWI) that identifies the mean activation level of the whole area was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The results of the AWI analysis suggest which tool has higher sensitivity and specificity. The proposed method seems a valid evaluation tool when applied to an adequate number of patients.
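    The fuzzy intersection/union index described above can be sketched in a few lines. The abstract does not give the exact definition, so this minimal illustration assumes the common fuzzy-Jaccard form: intersection as the voxel-wise minimum and union as the voxel-wise maximum of two membership maps.

    ```python
    def fuzzy_overlap(map_a, map_b):
        """Fuzzy Jaccard-style overlap between two membership maps.

        Each map is a flat list of voxel memberships in [0, 1].
        Intersection = voxel-wise min, union = voxel-wise max.
        """
        inter = sum(min(a, b) for a, b in zip(map_a, map_b))
        union = sum(max(a, b) for a, b in zip(map_a, map_b))
        return inter / union if union else 1.0  # two empty maps overlap fully

    # Two statistical parametric maps rescaled to [0, 1] memberships (toy data)
    spm1 = [0.0, 0.2, 0.9, 1.0]
    spm2 = [0.0, 0.4, 0.8, 1.0]
    print(fuzzy_overlap(spm1, spm2))  # 2.0 / 2.3, about 0.87
    ```

    An index of 1.0 means the two SPMs agree voxel by voxel; values near 0 flag the inconsistent situations the framework is designed to detect.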

  4. Evaluation of free non-diagnostic DICOM software tools

    NASA Astrophysics Data System (ADS)

    Liao, Wei; Deserno, Thomas M.; Spitzer, Klaus

    2008-03-01

    A variety of software exists to interpret files or directories compliant with the Digital Imaging and Communications in Medicine (DICOM) standard and display them as individual images or volume-rendered objects. Some of these tools offer further processing and analysis features. The surveys published so far are partly out of date, and neither a detailed description of the software functions nor a comprehensive comparison is given. This paper aims at evaluation and comparison of freely available, non-diagnostic DICOM software with respect to the following aspects: (i) data import; (ii) data export; (iii) header viewing; (iv) 2D image viewing; (v) 3D volume viewing; (vi) support; (vii) portability; (viii) workability; and (ix) usability. In total, 21 tools were included: 3D Slicer, AMIDE, BioImage Suite, DicomWorks, EViewBox, ezDICOM, FPImage, ImageJ, JiveX, Julius, MedImaView, MedINRIA, MicroView, MIPAV, MRIcron, Osiris, PMSDView, Syngo FastView, TomoVision, UniViewer, and XMedCon. Our results in table form can ease the selection of appropriate DICOM software tools. In particular, we discuss use cases for the inexperienced user, data conversion, and volume rendering, and suggest Syngo FastView or PMSDView, DicomWorks or XMedCon, and ImageJ or UniViewer, respectively.

  5. Evaluation of Software Tools for Segmentation of Temporal Bone Anatomy.

    PubMed

    Hassan, Kowther; Dort, Joseph C; Sutherland, Garnette R; Chan, Sonny

    2016-01-01

    Surgeons are increasingly relying on 3D medical image data for planning interventions. Virtual 3D models of intricate anatomy, such as that found within the temporal bone, have proven useful for surgical education, planning, and rehearsal, but such applications require segmentation of surgically relevant structures in the image data. Four publicly available software packages, ITK-SNAP, MITK, 3D Slicer, and Seg3D, were evaluated for their efficacy in segmenting temporal bone anatomy from CT and MR images to support patient-specific surgery simulation. No single application provided efficient means to segment every structure, but a combination of the tools evaluated enables creation of a complete virtual temporal bone model from raw image data with reasonably minimal effort.

  6. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  7. Survivability as a Tool for Evaluating Open Source Software

    DTIC Science & Technology

    2015-06-01

    combat systems face potential threats from cyber warfare professionals aiming to manipulate software embedded in the systems. The research highlights...current Department of Defense (DOD) interest in OSS, and explains a method for evaluating the capability of OSS programs to withstand cyber warfare attacks

  8. Software Tools for Software Maintenance

    DTIC Science & Technology

    1988-10-01

    Amisteant" project was commissioned to study the problems of software maintenance and to investigate the concept of bringing together a combination of loosely...integrated tools that could improve the productivity of maintenance programmers and increase the reliability of modified programs. One area of study has...that can aid in understanding the program and modifying it. Background work for study in this area included an examination of existing software tools

  9. Software Tool Issues

    NASA Astrophysics Data System (ADS)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  10. Methods and software tools for design evaluation in population pharmacokinetics-pharmacodynamics studies.

    PubMed

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)-pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization.
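    The core computation these tools share is turning a Fisher information matrix (FIM) into predicted standard errors: SE_i is the square root of the i-th diagonal element of the inverse FIM. Below is a minimal, dependency-free sketch for the 2x2 case; the matrix values are purely illustrative and are not taken from the warfarin or interferon models in the paper.

    ```python
    def predicted_se(fim):
        """Predicted parameter standard errors from a 2x2 Fisher information
        matrix: SE_i = sqrt([FIM^-1]_ii). The 2x2 inverse is hand-coded so the
        sketch needs no linear-algebra library."""
        (a, b), (c, d) = fim
        det = a * d - b * c
        inv_diag = (d / det, a / det)  # diagonal of FIM^-1
        return tuple(x ** 0.5 for x in inv_diag)

    # Hypothetical FIM for a two-parameter design (illustrative numbers only)
    fim = [[100.0, 10.0], [10.0, 25.0]]
    se = predicted_se(fim)
    print(se)  # approximately (0.102, 0.204)
    ```

    Design evaluation then amounts to comparing these predicted SEs across candidate designs; a block-diagonal versus full-matrix FIM approximation changes the off-diagonal terms fed into this inversion.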

  11. Methods and software tools for design evaluation in population pharmacokinetics–pharmacodynamics studies

    PubMed Central

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)–pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization. PMID:24548174

  12. Software tool for polarimeter design and evaluation and polarimetric data reduction

    NASA Astrophysics Data System (ADS)

    Engel, John R.; Tapia, Santiago

    1990-10-01

    We are developing an interactive computer program that includes many of the possible options available for the design of a polarimeter, for the analysis of the output signal, and for the computation of the polarization of the source under investigation. The software is being implemented on a microcomputer and is operated interactively. It utilizes pull-down menus, dialog boxes, graphics, and mouse-driven point-and-click techniques to ease user interaction. We have addressed the issue of how to represent the polarimetric model in software by manipulating optical element transformation matrices (Mueller matrices) to produce the resultant polarimeter matrix. To keep the polarimetric model general and to facilitate analytic evaluations of polarimeter designs, we represent these matrices in symbolic form in the software. The software tool will have wide-ranging applications in areas of engineering, research, and education dealing with polarimetry. It can be used as a design tool for polarimeters in laboratories, remote sensing, or both ground- and space-based astronomy. It can also be used as a simulation tool for polarimetric measurements, which will be useful for evaluating polarimeter designs or for educational purposes. The software contains a data reduction tool that can be used to evaluate intensity measurements made by a specified polarimeter.
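    The matrix chaining described above, multiplying the Mueller matrices of the optical elements to obtain the resultant polarimeter matrix and then applying it to a Stokes vector, can be illustrated numerically. The actual tool manipulates these matrices symbolically; only the ideal-polarizer matrix below is the standard textbook form, and the rest is a toy sketch.

    ```python
    def mueller_product(m, n):
        """4x4 Mueller matrix product m @ n (light passes through n first)."""
        return [[sum(m[i][k] * n[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    def apply_mueller(m, stokes):
        """Apply a Mueller matrix to a Stokes vector [I, Q, U, V]."""
        return [sum(m[i][k] * stokes[k] for k in range(4)) for i in range(4)]

    # Ideal horizontal linear polarizer (standard Mueller matrix)
    POL_H = [[0.5, 0.5, 0.0, 0.0],
             [0.5, 0.5, 0.0, 0.0],
             [0.0, 0.0, 0.0, 0.0],
             [0.0, 0.0, 0.0, 0.0]]

    # Cascade two elements into one polarimeter matrix (a polarizer is idempotent)
    instrument = mueller_product(POL_H, POL_H)

    # Unpolarized light of unit intensity through the instrument
    print(apply_mueller(instrument, [1.0, 0.0, 0.0, 0.0]))  # [0.5, 0.5, 0.0, 0.0]
    ```

    The output Stokes vector shows half the intensity transmitted and full linear polarization along Q, which is the expected behavior for an ideal polarizer.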

  13. Dynamic Susceptibility Contrast-MRI Quantification Software Tool: Development and Evaluation

    PubMed Central

    Korfiatis, Panagiotis; Kline, Timothy L.; Kelm, Zachary S.; Carter, Rickey E.; Hu, Leland S.; Erickson, Bradley J.

    2016-01-01

    Relative cerebral blood volume (rCBV) is a magnetic resonance imaging biomarker that is used to differentiate progression from pseudoprogression in patients with glioblastoma multiforme, the most common primary brain tumor. However, calculated rCBV depends considerably on the software used. Automating all steps required for rCBV calculation is important, as user interaction can lead to increased variability and possible inaccuracies in clinical decision-making. Here, we present an automated tool for computing rCBV from dynamic susceptibility contrast-magnetic resonance imaging that includes leakage correction. The entrance and exit bolus time points are automatically calculated using wavelet-based detection. The proposed tool is compared with 3 Food and Drug Administration-approved software packages, 1 automatic and 2 requiring user interaction, on a data set of 43 patients. We also evaluate manual and automated white matter (WM) selection for normalization of the cerebral blood volume maps. Our system showed good agreement with 2 of the 3 software packages. The intraclass correlation coefficient for all comparisons between the same software operated by different people was >0.880, except for FuncTool when operated by user 1 versus user 2. Little variability in agreement between software tools was observed when using different WM selection techniques. Our algorithm for automatic rCBV calculation with leakage correction and automated WM selection agrees well with 2 out of the 3 FDA-approved software packages. PMID:28066810
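    The white-matter normalization step mentioned above, dividing the CBV map by the mean CBV over a selected normal-appearing white-matter region, can be sketched as follows. The voxel values and mask are illustrative only, not from the study's data.

    ```python
    def normalize_rcbv(cbv, wm_mask):
        """Normalize a CBV map (flat list of voxel values) by the mean CBV
        inside a white-matter mask (list of booleans), yielding relative CBV."""
        wm_vals = [v for v, m in zip(cbv, wm_mask) if m]
        wm_mean = sum(wm_vals) / len(wm_vals)
        return [v / wm_mean for v in cbv]

    # Toy 4-voxel map; the first two voxels are taken as normal-appearing WM
    cbv = [2.0, 4.0, 6.0, 8.0]
    wm = [True, True, False, False]
    print(normalize_rcbv(cbv, wm))  # WM mean is 3.0, so [0.67, 1.33, 2.0, 2.67]
    ```

    Because the result is a ratio to the WM reference, differences in manual versus automated WM selection shift every rCBV value in the map, which is why the study evaluates both selection techniques.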

  14. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  15. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give users what they want, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems require larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  16. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his Computer-Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of APT programming languages for control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  17. Software Quality Tools

    DTIC Science & Technology

    1988-05-04


  18. The SRS-Viewer: A Software Tool for Displaying and Evaluation of Pyroshock Data

    NASA Astrophysics Data System (ADS)

    Eberl, Stefan

    2014-06-01

    For the evaluation of the success of a pyroshock, the time domain and the corresponding Shock Response Spectra (SRS) have to be considered. The SRS-Viewer is an IABG-developed software tool [1] that reads data in Universal File format (*.unv) and either displays or plots, for each accelerometer, the time domain, the corresponding SRS, and the specified Reference-SRS with tolerances in the background. The software calculates the "Average (AVG)", "Maximum (MAX)" and "Minimum (MIN)" SRS of any selection of accelerometers. A statistical analysis calculates the percentage of measured SRS above the specified Reference-SRS level and the percentage within the tolerance bands, for comparison with the specified success criteria. Overlay plots of single accelerometers from different test runs make it possible to monitor the repeatability of the shock input and the integrity of the specimen. Furthermore, the difference between the shock on a mass dummy and the real test unit can be examined.
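    The AVG/MAX/MIN and percentage statistics described above reduce to per-frequency aggregation over the selected accelerometer curves. The sketch below assumes all SRS are sampled on a common frequency grid; the function name and tolerance-band handling are illustrative assumptions, not IABG's actual implementation.

    ```python
    def srs_statistics(srs_curves, reference, lower_tol, upper_tol):
        """Per-frequency AVG/MAX/MIN over a selection of SRS curves, plus the
        percentage of measured points at or above the reference SRS and the
        percentage within the tolerance band.

        srs_curves: list of equal-length SRS magnitude lists (one per accelerometer).
        reference, lower_tol, upper_tol: equal-length lists of spectrum levels.
        """
        avg = [sum(col) / len(col) for col in zip(*srs_curves)]
        mx = [max(col) for col in zip(*srs_curves)]
        mn = [min(col) for col in zip(*srs_curves)]
        # Flatten all measured points, repeating the reference per curve
        points = [v for curve in srs_curves for v in curve]
        refs = [r for _ in srs_curves for r in reference]
        los = [l for _ in srs_curves for l in lower_tol]
        his = [h for _ in srs_curves for h in upper_tol]
        above = 100.0 * sum(v >= r for v, r in zip(points, refs)) / len(points)
        in_tol = 100.0 * sum(l <= v <= h for v, l, h in zip(points, los, his)) / len(points)
        return avg, mx, mn, above, in_tol

    # Two accelerometers, two frequency points (toy levels)
    avg, mx, mn, above, in_tol = srs_statistics(
        [[10.0, 20.0], [20.0, 40.0]],   # measured SRS curves
        [15.0, 25.0],                   # Reference-SRS
        [12.0, 20.0], [25.0, 45.0])     # lower / upper tolerance bands
    print(above, in_tol)  # 50.0 75.0
    ```

    The two percentages are exactly the quantities compared against the specified success criteria; the AVG/MAX/MIN curves support the overlay plots.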

  19. User Interface Software Tools

    DTIC Science & Technology

    1994-08-01


  20. Application of a Software tool for Evaluating Human Factors in Accident Sequences

    SciTech Connect

    Queral, Cesar; Exposito, Antonio; Gonzalez, Isaac

    2006-07-01

    The Probabilistic Safety Assessment (PSA) includes operator actions as elements among the protective features considered during accident sequences. Nevertheless, their impact throughout a sequence is not analyzed in a dynamic way. It is therefore useful to study their importance in the dynamics of the sequences in more detail, allowing sensitivity studies with respect to human reliability and response times. For this reason, the CSN is involved in several activities oriented toward developing a new safety analysis methodology, the Integrated Safety Assessment (ISA), which must be able to incorporate operator actions into conventional thermal-hydraulic (TH) simulations. One of these activities is the collaboration project between CSN, HRP and the DSE-UPM that started in 2003. In the framework of this project, a software tool has been developed to incorporate operator actions into TH simulations. As a part of the ISA, this tool permits the quantification of human error probabilities (HEP) and the evaluation of their impact on the final state of the plant. Independently, it can be used to evaluate the impact of operators' execution of procedures and guidelines on the final state of the plant, and to evaluate the allowable response times for manual operator actions. The results obtained in the first pilot case are included in this paper. (authors)

  1. [Software CMAP TOOLS ™ to build concept maps: an evaluation by nursing students].

    PubMed

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to draw up. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To this end, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focus group technique. The results showed that the software facilitates and ensures the organization, visualization, and correlation of the data, but there are initial difficulties related to handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software utilization.

  2. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.

  3. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still lack of professional evaluation tools capable of assessing the quality of used digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  4. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA needs science platform vehicles that autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another is the need to reduce ground-support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In Sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In Section 4 the methodology for the experiment is described; this

  5. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    PubMed

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, participants perceived the software and its embedded ergonomics tools as benefiting their design work, increasing their empathy for and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding of applied ergonomics in naval architecture among engineering postgraduate students.

  6. Biological imaging software tools.

    PubMed

    Eliceiri, Kevin W; Berthold, Michael R; Goldberg, Ilya G; Ibáñez, Luis; Manjunath, B S; Martone, Maryann E; Murphy, Robert F; Peng, Hanchuan; Plant, Anne L; Roysam, Badrinath; Stuurman, Nico; Stuurmann, Nico; Swedlow, Jason R; Tomancak, Pavel; Carpenter, Anne E

    2012-06-28

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the inherent challenges and the overall status of available software for bioimage informatics, focusing on open-source options.

  7. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  8. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment in which three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is, to the best of our knowledge, the first controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe they can still serve as a valuable point of reference for future studies of this kind. The study confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  9. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    SciTech Connect

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-07

    Research highlights: (1) The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. (2) Off-target silencing scores below 30 were observed for the best siRNA constructs for different genes. (3) Immunostimulation and cytotoxicity motifs are considered and penalized by the developed tool. (4) Both positional and compositional criteria were observed to be important. -- Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria (Reynolds's design rules, thermodynamic stability, internal repeats, and immunostimulatory motifs) were emphasized and implemented in the siRNA design tool. For each construct, the tool reports a thermodynamic stability score, the GC content, and a total score based on the other design criteria. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic scores and positional properties. Comparable thermodynamic scores and better total scores were observed relative to existing tools. Moreover, the results generated had comparable off-target silencing effects. Criteria evaluation with additional criteria was carried out in WEKA.
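
    As a rough illustration of how positional and compositional criteria of this kind can be scored, the sketch below computes GC content and a simplified subset of Reynolds-style rules for a 19-nt sense strand. The rule subset and weights are illustrative assumptions, not the published tool's actual scoring scheme.

    ```python
    # Toy scoring of a 19-nt siRNA sense strand. The rules below are a
    # simplified subset of Reynolds-style criteria, chosen for illustration
    # only; they are not the scoring scheme of the tool described above.

    def gc_content(seq: str) -> float:
        """Fraction of G/C bases in a sequence (T treated as U)."""
        s = seq.upper().replace("T", "U")
        return (s.count("G") + s.count("C")) / len(s)

    def toy_design_score(sense: str) -> int:
        """Score a 19-nt sense strand against a few positional rules."""
        s = sense.upper().replace("T", "U")
        if len(s) != 19:
            raise ValueError("expected a 19-nt sense strand")
        score = 0
        if 0.30 <= gc_content(s) <= 0.52:
            score += 1                              # moderate GC content
        score += sum(b in "AU" for b in s[14:19])   # A/U at positions 15-19
        if s[18] == "A":
            score += 1                              # A at position 19
        if s[9] == "U":
            score += 1                              # U at position 10
        if s[12] != "G":
            score += 1                              # no G at position 13
        return score
    ```

    For example, toy_design_score("GCAUGUGCAUAUGCAAUAA") scores 8 of a possible 9 under these simplified rules.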

  10. Landscape analysis software tools

    Treesearch

    Don Vandendriesche

    2008-01-01

    Recently, several new computer programs have been developed to assist in landscape analysis. The “Sequential Processing Routine for Arraying Yields” (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...

  11. Software Tools: EPICUR.

    ERIC Educational Resources Information Center

    Abreu, Jose Luis; And Others

    EPICUR (Integrated Programing Environment for the Development of Educational Software) is a set of programming modules ranging from low level interfaces to high level algorithms aimed at the development of computer-assisted instruction (CAI) applications. The emphasis is on user-friendly interfaces and on multiplying productivity without loss of…

  12. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", "libtool", and the de facto standard build tool, "make". This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high-performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and the need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of the configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for even a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  13. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
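
    The QSAR idea can be sketched in a few lines: fit a model mapping molecular descriptors to a toxicity endpoint, then predict for new chemicals. The descriptor values and endpoints below are synthetic and purely illustrative; they are not TEST's actual models, descriptors, or data.

    ```python
    import numpy as np

    # Minimal linear QSAR sketch: toxicity endpoint modeled as a linear
    # function of molecular descriptors. All numbers are synthetic.

    # rows: chemicals; columns: descriptors (e.g., logP and molecular weight)
    X = np.array([[1.2, 180.0],
                  [2.5, 240.0],
                  [0.8, 150.0],
                  [3.1, 300.0]])
    y = np.array([2.1, 3.4, 1.7, 4.0])        # synthetic endpoint, e.g. -log(LC50)

    A = np.hstack([np.ones((len(X), 1)), X])  # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(descriptors):
        """Predict the endpoint for one chemical's descriptor vector."""
        return float(coef[0] + coef[1:] @ np.asarray(descriptors))
    ```

    Real QSAR tools use many more descriptors, nonlinear models, and applicability-domain checks, but the fit-then-predict structure is the same.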

  15. A software tool to evaluate crystal types and morphological developments of accessory zircon

    NASA Astrophysics Data System (ADS)

    Sturm, Robert

    2014-08-01

    Computer programs for appropriately visualizing the crystal types and morphological developments of accessory zircon have hitherto been unavailable. Typological computations are usually conducted with simple calculation tools or spreadsheet programs. In practice, however, large numbers of data sets containing information on numerous zircon populations have to be processed and stored. This paper describes the software ZIRCTYP, a macro-driven program within the Microsoft Access database management system. It allows the computation of zircon morphologies occurring in specific rock samples and their presentation in typology diagrams. In addition, morphological developments within a given zircon population are presented (1) statistically and (2) graphically, as crystal sequences showing initial, intermediate, and final growth stages.

  16. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    SciTech Connect

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical use, and manual evaluation is very time consuming. A new integrated software tool using supervised contour pattern recognition was therefore developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the basis of the tool's design. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for self-updating of the system to geometric variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The contour evaluation reporting function was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
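
    The supervised-PCA idea can be sketched as follows: learn principal components from feature vectors of approved contours, then flag a new contour whose PCA reconstruction error exceeds a threshold set on the training data. The two-dimensional synthetic features and the 95th-percentile threshold are illustrative assumptions, not the paper's actual feature set or decision rule.

    ```python
    import numpy as np

    # PCA-based normality check, sketched on synthetic 2-D "contour features".
    # Real contour features would be higher-dimensional shape descriptors.

    rng = np.random.default_rng(0)
    # 50 physician-approved contours: correlated 2-D feature vectors
    approved = rng.normal(0.0, 1.0, (50, 2)) @ np.array([[3.0, 0.5],
                                                         [0.5, 1.0]])

    mean = approved.mean(axis=0)
    _, _, Vt = np.linalg.svd(approved - mean, full_matrices=False)
    components = Vt[:1]                       # keep the first principal axis

    def recon_error(x):
        """Distance between x and its projection onto the learned subspace."""
        z = (x - mean) @ components.T
        return float(np.linalg.norm(x - (z @ components + mean)))

    # threshold: 95th percentile of errors on the approved training contours
    threshold = float(np.quantile([recon_error(x) for x in approved], 0.95))

    def is_abnormal(x):
        return recon_error(x) > threshold
    ```

    A contour lying near the learned subspace reconstructs well and passes; one far off the principal axes is flagged for physician review.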

  17. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    NASA Astrophysics Data System (ADS)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
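
    Because the DFT is a linear operation Y = C x, GUM-compliant propagation of an input covariance U_x reduces to U_Y = C U_x C^H. The brute-force matrix sketch below illustrates the principle; GUM2DFT itself uses closed formulas that avoid forming these matrices.

    ```python
    import numpy as np

    # Brute-force GUM propagation through the DFT: Y = C x, U_Y = C U_x C^H.
    # GUM2DFT achieves the same result with efficient closed formulas.

    N = 8
    n = np.arange(N)
    C = np.exp(-2j * np.pi * np.outer(n, n) / N)   # DFT matrix

    x = np.sin(2 * np.pi * n / N)                  # example time-domain signal
    ux = 0.05 * np.ones(N)                         # standard uncertainties of x
    Ux = np.diag(ux**2)                            # uncorrelated input covariance

    X = C @ x                                      # same as np.fft.fft(x)
    UX = C @ Ux @ C.conj().T                       # propagated covariance
    ```

    For uncorrelated, equal input uncertainties the propagated covariance is simply u² N I, since C C^H = N I; correlated inputs, which GUM2DFT also handles, make the frequency-domain covariance non-diagonal.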

  18. Establishing a Methodology for Evaluation and Selecting Computer Aided Software Engineering Tools for a Defined Software Engineering Environment at the Air Force Institute of Technology School of Engineering

    DTIC Science & Technology

    1991-12-01

    F. Lecouat and V. Ambriola, "A Tool to Coordinate Tools," IEEE Software: 17-25 (November 1988). 6. Bruce, T. A., J. Fuller, and T. Moriarty, "So You..." Journal of Systems Management, 40-5: 29-32 (May 1989). BIB.1 14. Dart, S. A., R. J. Ellison, P. H. Feiler, and A. N. Habermann, "Software

  19. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    NASA Astrophysics Data System (ADS)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like marine organisms that build tiny calcareous multi-chambered shells for protection. Their enormous abundance, great variation of shape through time, and presence in all marine deposits made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides their success in the oil and gas industry, forams are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. Until recently, however, the best information on this architecture was obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues about internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CT systems, called nano-CT because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise, and contrast are all important parameters that strongly affect the accuracy of the results and the speed of data processing. Combining this tomography technique with specific image processing, called segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is

  20. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    PubMed

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity. Several software tools have recently been published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
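
    The reproducibility metric quoted above (median coefficient of variation of protein abundances) is straightforward to compute; the sketch below uses synthetic abundances for 200 proteins across 3 technical replicates, not the study's data.

    ```python
    import numpy as np

    # Median CV across proteins: CV = std / mean of each protein's abundance
    # over technical replicates. Abundances below are synthetic.

    rng = np.random.default_rng(1)
    true_abundance = rng.uniform(1e3, 1e6, size=200)    # 200 proteins
    noise = rng.normal(1.0, 0.03, size=(200, 3))        # ~3% relative variation
    replicates = true_abundance[:, None] * noise        # 3 technical replicates

    cv = replicates.std(axis=1, ddof=1) / replicates.mean(axis=1)
    median_cv_percent = float(100 * np.median(cv))
    ```

    A median CV below 5%, as reported for Progenesis and ISOQuant, corresponds to median_cv_percent < 5 in this formulation.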

  1. ASSET: a software tool for the evaluation of manoeuvre capabilities of highly agile satellites

    NASA Astrophysics Data System (ADS)

    Barschke, Merlin F.; Levenhagen, Jens; Reggio, Domenico; Roberts, Peter C. E.

    2014-03-01

    The new generation of agile Earth observation satellites provides much higher observation capabilities than their non-agile predecessors. From a kinematic point of view, these capabilities result in more complex guidance laws for the spacecraft's attitude control system. The computation of these guidance laws is driven by a number of factors. For instance, the Earth's curved shape and its rotation, in combination with the possible scan path geometries, lead to a highly nonlinear relation between the motion of the satellite and the line-of-sight projection onto the Earth. In this paper ASSET (Agile Satellites Scenario Evaluation Tool) is presented. ASSET is a modular MATLAB command line tool developed at Astrium GmbH, Germany, to assess the manoeuvre capabilities of agile satellites carrying time-delay integration instruments. Each single scenario may consist of one or several ground scans, linked by suitable spacecraft slews. Once the entire scenario is defined, ASSET will analyse whether the kinematic and dynamic constraints of a specific satellite allow this scenario to be performed and will then generate the related guidance profile (angles and angular rates). The satellite's ground track, the projection of the instrument's line-of-sight, and the projection of the instrument's field of view onto the Earth can be plotted for visual inspection. ASSET can perform the analysis of scenarios with several different scan modes usually performed by this type of satellite.

  2. Evaluation of two software tools dedicated to an automatic analysis of the CT scanner image spatial resolution.

    PubMed

    Torfeh, Tarraf; Beaumont, Stéphane; Guédon, Jean Pierre; Denis, Eloïse

    2007-01-01

    An evaluation of two software tools dedicated to automatic analysis of CT scanner image spatial resolution is presented in this paper. Both evaluated methods calculate the Modulation Transfer Function (MTF) of the CT scanner; the first uses an image of an impulse source, while the second, proposed by Droege and Morin, uses an image of cyclic bar patterns. Two Digital Test Objects (DTOs) were created for this purpose. These DTOs are blurred by convolution with a two-dimensional Gaussian Point Spread Function (PSF(Ref)) with a well-known Full Width at Half Maximum (FWHM). The evaluation process then consists of comparing the Fourier transform of the reference PSF with the MTF obtained by each of the two methods.
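
    The impulse-source method can be sketched directly: the MTF is the normalized magnitude of the Fourier transform of the PSF. The sketch below uses a 1-D Gaussian PSF with known FWHM (the paper's reference PSF is two-dimensional) and compares the result against the analytic Gaussian MTF.

    ```python
    import numpy as np

    # MTF from an impulse response: normalized |FT(PSF)|. A 1-D Gaussian PSF
    # with known FWHM stands in for the 2-D reference PSF of the paper.

    fwhm = 2.0                                     # mm
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    dx = 0.1                                       # sampling step, mm
    x = (np.arange(512) - 256) * dx
    psf = np.exp(-x**2 / (2.0 * sigma**2))

    mtf = np.abs(np.fft.rfft(psf))
    mtf /= mtf[0]                                  # normalize to 1 at f = 0
    freqs = np.fft.rfftfreq(psf.size, d=dx)        # spatial frequency, cycles/mm

    # analytic MTF of a Gaussian PSF: exp(-2 * pi^2 * sigma^2 * f^2)
    ```

    Because the continuous Fourier transform of a Gaussian is again a Gaussian, the numerically computed MTF can be checked bin-by-bin against the analytic expression, mirroring the paper's comparison of FT(PSF(Ref)) with each method's output.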

  3. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis

    PubMed Central

    Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. Here we compare the time required by two different software packages (‘Edgewarp’ and ‘Morpho’) for the same sliding task, and investigate potential differences in the results and biological interpretation. ‘Morpho’ is much faster than ‘Edgewarp,’ notably as a result of the greater computational power of the ‘Morpho’ software routines and the complexity of the ‘Edgewarp’ workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086

  4. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis.

    PubMed

    Botton-Divet, Léo; Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. Here we compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses.

  5. The Design and Evaluation of Class Exercises as Active Learning Tools in Software Verification and Validation

    ERIC Educational Resources Information Center

    Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil

    2016-01-01

    It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…

  6. Evaluating Difficulty Levels of Dynamic Geometry Software Tools to Enhance Teachers' Professional Development

    ERIC Educational Resources Information Center

    Hohenwarter, Judith; Hohenwarter, Markus; Lavicza, Zsolt

    2010-01-01

    This paper describes a study aimed to identify commonly emerging impediments related to the introduction of dynamic mathematics software. We report on the analysis of data collected during a three-week professional development programme organised for middle and high school teachers in Florida. The study identified challenges that participants face…

  7. Modification, Implementation, and Evaluation of a Remote Terminal Emulator as a Software Validation and Stress Testing Tool.

    DTIC Science & Technology

    1987-12-01

    ...management. The two tools are also compared during the emulation phase of software validation. The RTE package was also examined as a stress testing tool... terminal emulator is the continuation of an effort in software and hardware configuration management which began at the Military Airlift Command in

  8. Component Modeling Approach Software Tool

    SciTech Connect

    2010-08-23

    The Component Modeling Approach Software Tool (CMAST) establishes a set of performance libraries of approved components (frames, glass, and spacers) which can be accessed for configuring fenestration products for a project and obtaining a U-factor, Solar Heat Gain Coefficient (SHGC), and Visible Transmittance (VT) rating for those products, which can then be reflected in a CMA Label Certificate for code compliance. CMAST is web-based as well as client-based. The completed CMA program and software tool will be useful in several ways for a vast array of stakeholders in the industry: generating performance ratings for bidding projects; ascertaining credible and accurate performance data; and obtaining third-party certification of overall product performance for code compliance.

  9. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    NASA Astrophysics Data System (ADS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal
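
    A common baseline metric of the kind described, computed from detector scores, is the probability of detection (PD) at a fixed false-alarm rate (FAR). The sketch below uses synthetic Gaussian score distributions as a stand-in for scores produced on real background and threat spectra.

    ```python
    import numpy as np

    # PD at fixed FAR from detector scores. Scores are synthetic Gaussians;
    # real algorithms would score measured or simulated gamma spectra.

    rng = np.random.default_rng(2)
    background_scores = rng.normal(0.0, 1.0, 10_000)   # benign encounters
    threat_scores = rng.normal(3.0, 1.0, 1_000)        # seeded-source encounters

    far_target = 0.01                                  # accept 1% false alarms
    threshold = float(np.quantile(background_scores, 1.0 - far_target))
    pd_at_far = float(np.mean(threat_scores > threshold))
    ```

    Publishing shared datasets with an agreed metric like PD-at-FAR is precisely what makes algorithm results comparable across developers, which is the gap this work aims to close.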

  10. STE - The Software Tools Editor

    NASA Astrophysics Data System (ADS)

    Software Tools is an excellent book written by B. W. Kernighan and P. J. Plauger and published by Addison-Wesley. In it, the authors discuss how to write programs that make good tools, and how to program well in the process. One of the tools they develop is a fairly powerful editor, written in Ratfor (a structured form of FORTRAN IV). This program has been implemented on the UCL Starlink VAX (with a few modifications and extensions) and is recommended as the editor to use on the VAX. This note gives a brief introduction to, and description of, the editor, abstracted from the book (which you are recommended to buy). There are some short command summary sections at the end of this note. After reading this note you may like to print these short files and use them for reference when using the editor.

  11. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    ERIC Educational Resources Information Center

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  12. Evaluation of Visualization Software

    NASA Technical Reports Server (NTRS)

    Globus, Al; Uselton, Sam

    1995-01-01

    Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.

  13. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    SciTech Connect

    Not Available

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  14. Early clinical evaluation of a novel three-dimensional structure delineation software tool (SCULPTER) for radiotherapy treatment planning.

    PubMed

    McBain, C A; Moore, C J; Green, M M L; Price, G; Sykes, J S; Amer, A; Khoo, V S; Price, P

    2008-08-01

    Modern radiotherapy treatment planning (RTP) necessitates increased delineation of target volumes and organs at risk. Conventional manual delineation is a laborious, time-consuming and subjective process. It is prone to inconsistency and variability, but has the potential to be improved using automated segmentation algorithms. We carried out a pilot clinical evaluation of SCULPTER (Structure Creation Using Limited Point Topology Evidence in Radiotherapy) - a novel prototype software tool designed to improve structure delineation for RTP. Anonymized MR and CT image datasets from patients who underwent radiotherapy for bladder or prostate cancer were studied. An experienced radiation oncologist used manual and SCULPTER-assisted methods to create clinically acceptable organ delineations. SCULPTER was also tested by four other RTP professionals. Resulting contours were compared by qualitative inspection and quantitatively by using the volumes of the structures delineated and the time taken for completion. The SCULPTER tool was easy to apply to both MR and CT images and diverse anatomical sites. SCULPTER delineations closely reproduced manual contours with no significant volume differences detected, but SCULPTER delineations were significantly quicker (p<0.05) in most cases. In conclusion, clinical application of SCULPTER resulted in rapid and simple organ delineations with equivalent accuracy to manual methods, demonstrating proof-of-principle of the SCULPTER system and supporting its potential utility in RTP.

  15. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  16. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  17. DIsulfide Mapping PLanner Software Tool.

    PubMed

    Kist, Andreas M; Lampert, Angelika; O'Reilly, Andrias O

    2017-08-17

Disulfide bridges are side-chain-mediated covalent bonds between cysteines that stabilize many protein structures. Disulfide mapping experiments to resolve these linkages typically involve proteolytic cleavage of the protein of interest followed by mass spectrometry to identify fragments corresponding to linked peptides. Here we report the sequence-based "DIMPL" web tool to facilitate the planning and analysis steps of experimental mapping studies. The software tests permutations of user-selected proteases to determine an optimal digest that produces cleavage between cysteine residues, thus separating each into an individual peptide fragment. The webserver returns fragment sequence and mass data that can be dynamically ordered to enable straightforward comparative analysis with mass spectrometry results, facilitating dipeptide identification.
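The digest-planning idea described above can be sketched in a few lines: cut a sequence with a simple protease rule and check whether every fragment carries at most one cysteine. This is an illustrative sketch only, not the DIMPL algorithm; the cleavage rules are simplified (cut C-terminal to the named residues, ignoring exceptions such as adjacent proline), and the example sequence is invented.

```python
# Simplified cleavage rules: cut C-terminal to the listed residues.
RULES = {
    "trypsin": "KR",       # cuts after Lys/Arg (simplified)
    "chymotrypsin": "FWY"  # cuts after aromatic residues (simplified)
}

def digest(sequence, protease):
    """Split a sequence at the protease's cleavage sites."""
    fragments, start = [], 0
    for i, residue in enumerate(sequence):
        if residue in RULES[protease]:
            fragments.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        fragments.append(sequence[start:])
    return fragments

def separates_cysteines(fragments):
    # A digest is useful for disulfide mapping when no fragment
    # contains two or more cysteines.
    return all(f.count("C") <= 1 for f in fragments)

frags = digest("ACKMCRGFCW", "trypsin")  # ['ACK', 'MCR', 'GFCW']
print(frags, separates_cysteines(frags))
```

A planner in the spirit of DIMPL would iterate this check over permutations of the available proteases and report the digests that pass.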

  18. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals which are refined into quantifiable questions which specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used in an active as well as a passive way, by learning from analyzing the data and improving the methods and tools based upon what is learned from that analysis. Several examples were given representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.
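The goal-to-question-to-metric refinement described above can be made concrete with a small data structure. The entries below are invented for illustration, not taken from the NASA program; the point is only that every collected metric traces back through a question to a goal.

```python
# Hypothetical goal/question/metric refinement (illustrative entries).
gqm = {
    "goal": "Evaluate whether the new tool reduces maintenance cost",
    "questions": [
        {"question": "How much effort does a change require?",
         "metrics": ["staff-hours per change request"]},
        {"question": "How error-prone are changes?",
         "metrics": ["defects introduced per change", "rework hours"]},
    ],
}

# Walk the refinement: every metric is justified by a question,
# and every question by the goal.
for q in gqm["questions"]:
    for metric in q["metrics"]:
        print(gqm["goal"], "<-", q["question"], "<-", metric)
```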

  19. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  20. A software tool for ecosystem services assessments

    NASA Astrophysics Data System (ADS)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts: • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined. • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a

  1. Evaluation of educational software.

    PubMed

    Schleyer, Titus K L; Johnson, Lynn A

    2003-11-01

    Evaluation is an important component of developing educational software. Ideally, such evaluation quantifies and qualifies the effects of a new educational intervention on the learning process and outcomes. Conducting meaningful and rigorous educational evaluation is difficult, however. Challenges include defining and measuring educational outcomes, accounting for media effects, coping with practical problems in designing studies, and asking the right research questions. Practical considerations that make the design of evaluation studies difficult include confounding, potentially small effect sizes, contamination effects, and ethics. Two distinct approaches to evaluation are objectivist and subjectivist. These two complement each other in describing the whole range of effects a new educational program can have. Objectivist demonstration studies should be preceded by measurement studies that assess the reliability and validity of the evaluation instrument(s) used. Many evaluation studies compare the performance of learners who are exposed to either the new program or a more traditional approach. However, this method is problematic because test or exam performance is often a weak indicator of competence and may fail to capture important nuances in outcomes. Subjectivist studies are more qualitative in nature and may provide insights complementary to those gained with objectivist studies. Several published examples are used in this article to illustrate different evaluation methods. Readers are encouraged to contemplate a wide range of evaluation study designs and explore increasingly complex questions when evaluating educational software.

  2. Laboratory evaluation of the Acapella device: pressure characteristics under different conditions, and a software tool to optimize its practical use.

    PubMed

    Alves Silva, Carlos Eduardo; Santos, Josiel G; Jansen, José M; de Melo, Pedro Lopes

    2009-11-01

The Acapella is a respiratory rehabilitation device designed to aid sputum clearance. When the patient exhales through this device, continuous and oscillatory pressure levels are produced. The adequate practical use of the Acapella is critically dependent on the characteristics of the produced pressure, which include the production of a mean pressure ≥ 10 cm H2O and a matching of the oscillation frequency with the respiratory-system resonance frequency and/or with the frequency of ciliary movement (approximately 13 Hz). The development of a dedicated software tool would contribute to optimizing the clinical application of this device. Thus, the aim of this study was 2-fold: to characterize the mechanical behavior of the Acapella, and to develop a software tool to ease the practical use of this device. An experimental setup was assembled in order to study the mean pressure, oscillation frequency, and oscillation amplitudes produced by 3 Acapella devices (green model) over the whole range of instrument adjustments and under air flow rates ranging from 200 mL/s to 800 mL/s. In order to increase flexibility, allowing the fast integration of further information obtained in future studies, the software was developed in a graphical environment. The device characterization showed an oscillation frequency varying from 8 Hz to 21 Hz, mean pressure ranging from 3 cm H2O to 23 cm H2O, and oscillation amplitude from 4 cm H2O to 9 cm H2O. These parameters increased with flow and instrument adjustment. A user-friendly software tool was developed, incorporating the current knowledge concerning secretion removal. After the user enters the desired frequency and the patient's air flow, the software automatically calculates the necessary instrument adjustment, as well as the mean pressure and oscillation amplitude. The Acapella device may produce clinically adequate values of mean pressure and oscillation frequency. However, this depends on using the device under optimized conditions.
The user
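The adjustment calculation described above amounts to inverting a bench characterization curve: given the dial-setting-to-frequency data measured at a flow rate, find the setting that yields a desired oscillation frequency. The sketch below uses invented calibration numbers (loosely matching the 8-21 Hz range reported), not actual Acapella data, and a single fixed flow for simplicity.

```python
import bisect

# Hypothetical calibration at a fixed flow: dial settings 1..5 and
# the oscillation frequencies (Hz) they were measured to produce.
DIAL = [1, 2, 3, 4, 5]
FREQ = [8.0, 11.0, 14.0, 17.5, 21.0]

def dial_for_frequency(target_hz):
    """Linearly interpolate the dial setting for a desired frequency."""
    if not FREQ[0] <= target_hz <= FREQ[-1]:
        raise ValueError("target frequency outside device range")
    i = bisect.bisect_left(FREQ, target_hz)
    if FREQ[i] == target_hz:
        return float(DIAL[i])
    f0, f1 = FREQ[i - 1], FREQ[i]
    d0, d1 = DIAL[i - 1], DIAL[i]
    return d0 + (target_hz - f0) * (d1 - d0) / (f1 - f0)

# Setting needed to match the ciliary-movement frequency (~13 Hz).
print(dial_for_frequency(13.0))
```

The real tool additionally reports the mean pressure and oscillation amplitude expected at that setting, which would come from the same characterization tables.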

  3. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  4. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

5. Hydropower Biological Evaluation Tools

    SciTech Connect

    2016-10-06

This software is a set of analytical tools to evaluate the physical and biological performance of existing, refurbished, or newly installed conventional hydro-turbines nationwide where fish passage is a regulatory concern. The current version is based on information collected by the Sensor Fish. Future versions will include other technologies. The tool set includes data acquisition, data processing, and biological response tools, with applications to various turbine designs and other passage alternatives. The associated database is centralized and can be accessed remotely. We have demonstrated its use for various applications, including both turbines and spillways.

  6. Advanced Tools for Software Maintenance.

    DTIC Science & Technology

    1982-12-01

…Old Applications… Training People to Use New Tools… Ada Style Guidelines… and application-specific programming techniques and methods. The Intelligent Editor provides facilities for manipulating programs at several… are applicable today or in the near future. In identifying tools and techniques, this study focused on one aspect of the maintenance problem…

  7. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  8. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  9. Software Tools for Nonlinear Missile Autopilot Design

    DTIC Science & Technology

    1999-01-01

Software Tools for Nonlinear Missile Autopilot Design. P.K. Menon, V.R. Iragavarapu, and G. Sweriduk, Optimal Synthesis Inc., Palo Alto, CA; E.J. Ohlmeyer, Naval Surface Warfare Center, Dahlgren, VA. Abstract: A computer-aided design software package for nonlinear control synthesis is discussed. The software incorporates five different modern…

  10. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS; ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; iii) interpolate correction factors from look-up tables; iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
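Steps ii)-iv) above reduce to a per-pixel table lookup and multiplication, which can be sketched with NumPy. The correction-factor values below are placeholders invented for illustration; real factors would be measured for the specific EPID and MLC.

```python
import numpy as np

# Placeholder lookup table: MLC-shielded fraction of beam-on time
# mapped to a multiplicative EPID correction factor (invented values).
FRACTION = np.array([0.0, 0.5, 1.0])
CORRECTION = np.array([1.00, 0.90, 0.80])

def correct_pdip(pdip, shielded_fraction):
    """Interpolate correction factors per pixel and apply to the PDIP."""
    factors = np.interp(shielded_fraction, FRACTION, CORRECTION)
    return pdip * factors

# Toy 2x2 predicted image and per-pixel shielded fractions.
pdip = np.ones((2, 2))
shielded = np.array([[0.0, 0.25], [0.5, 1.0]])
print(correct_pdip(pdip, shielded))
```

The shielded-fraction map itself would come from parsing the MLC leaf-sequence file, which is omitted here.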

  11. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  12. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The categories of PC software products evaluated include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  13. TOOLPACK1. Software Tools for FORTRAN 77

    SciTech Connect

    Cowell, W.R.

    1992-02-18

    TOOLPACK1 consists of the following categories of software: (1) an integrated collection of tools intended to support the development and maintenance of Fortran 77 programs, in particular moderate-sized collections of mathematical software; (2) three user/Toolpack interfaces, one of which is selected for use at any particular installation; (3) three implementations of the tool/system interface, called TIE (Tool Interface to the Environment). The tools are written in Fortran 77 and are portable among TIE installations. The source contains symbolic constants as macro names and must be expanded with a suitable macro expander before being compiled and loaded. A portable macro expander is supplied in TOOLPACK1. The tools may be divided into three functional areas: general, documentation, and FORTRAN processing. One tool, the macro processor, can be used in any of these categories.

  14. Tool Support for Software Lookup Table Optimization

    DOE PAGES

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
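The core LUT technique that Mesa automates is easy to illustrate: precompute a costly function over its input domain and answer later calls from the table, accepting a bounded approximation error that shrinks as the table grows. This sketch uses nearest-entry lookup for `math.sin`; Mesa itself operates on C/C++ source and also supports interpolated lookups.

```python
import math

class LookupTable:
    """Precomputed table approximating fn over [lo, hi]."""
    def __init__(self, fn, lo, hi, size):
        self.lo = lo
        self.step = (hi - lo) / (size - 1)
        self.values = [fn(lo + i * self.step) for i in range(size)]

    def __call__(self, x):
        # Nearest-entry lookup; error is bounded by roughly
        # (step / 2) * max|fn'| on the domain.
        i = round((x - self.lo) / self.step)
        return self.values[i]

# Table size controls the performance/accuracy tradeoff.
lut = LookupTable(math.sin, 0.0, math.pi, 4097)
err = max(abs(lut(x / 1000) - math.sin(x / 1000)) for x in range(3000))
print(err)
```

Here the maximum error stays under step/2 ≈ 3.9e-4; halving it requires doubling the table, which is exactly the tradeoff Mesa's error analysis manages automatically.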

  15. Tool support for software lookup table optimization

    PubMed Central

    Strout, Michelle Mills; Bieman, James M.

    2012-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0 × and 6.9 × for two molecular biology algorithms, 1.4 × for a molecular dynamics program, 2.1 × to 2.8 × for a neural network application, and 4.6 × for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963

  16. Man versus Machine: Software Training for Surgeons—An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance

    PubMed Central

    Smith, Phillip; Sharma, Anant; Jones, Simon; Sullivan, Paul

    2016-01-01

This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation of −0.6792619 (p = 0.001), −0.6652021 (p = 0.002), and −0.771529 (p = 0.001), respectively, with OSACCS. Sixty-two videos evaluated microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with experts of 3.6 (1.6) mm (p < 0.05). The expert surgeons maintained good microscope camera control and limited total pupil path length travelled 2512 (1031) mm compared with novices of 4049 (2709) mm (p < 0.05). Good agreement between human and machine quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills. PMID:27867658
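The two camera-control surrogates in this study, total path length travelled by the pupil centre and its distance from the image centre, are simple functions of a tracked coordinate sequence. The sketch below computes both from an invented track; it illustrates the metrics only and is not the PhacoTrack implementation.

```python
import math

def path_length(points):
    """Sum of Euclidean distances between successive tracked positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def mean_offcentre(points, centre=(0.0, 0.0)):
    """Mean distance of the pupil centre from the image centre."""
    return sum(math.dist(p, centre) for p in points) / len(points)

# Invented pupil-centre track (mm, relative to image centre).
track = [(0, 0), (3, 4), (3, 0)]
print(path_length(track), mean_offcentre(track))  # 9.0, ~2.67
```

Lower values on both metrics correspond to the steadier camera control the study observed in expert surgeons.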

  17. Man versus Machine: Software Training for Surgeons-An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance.

    PubMed

    Din, Nizar; Smith, Phillip; Emeriewen, Krisztina; Sharma, Anant; Jones, Simon; Wawrzynski, James; Tang, Hongying; Sullivan, Paul; Caputo, Silvestro; Saleh, George M

    2016-01-01

This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation of -0.6792619 (p = 0.001), -0.6652021 (p = 0.002), and -0.771529 (p = 0.001), respectively, with OSACCS. Sixty-two videos evaluated microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with experts of 3.6 (1.6) mm (p < 0.05). The expert surgeons maintained good microscope camera control and limited total pupil path length travelled 2512 (1031) mm compared with novices of 4049 (2709) mm (p < 0.05). Good agreement between human and machine quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills.

  18. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed in C# under Microsoft .NET to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During over 1 year of use, the tool has proven very helpful in chart checking management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinics that are either in the process of pursuing or maintaining the American College of Radiology accreditation.

  19. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  20. Software development tools: A bibliography, appendix C.

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.

    1980-01-01

    A bibliography containing approximately 200 citations on tools which help software developers perform some development task (such as text manipulation, testing, etc.), and which would not necessarily be found as part of a computing facility is given. The bibliography comes from a relatively random sampling of the literature and is not complete. But it is indicative of the nature and range of tools currently being prepared or currently available.

  1. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    NASA Astrophysics Data System (ADS)

    Grape, Sophie; Jacobsson Svärd, Staffan; Lindberg, Bo

    2013-01-01

The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by the International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has for several years been approved by the IAEA for gross defect verification, i.e. to control whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify if a fraction of the fuel rods in an assembly have been removed or replaced. The latter recognition was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software that were used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3] used to obtain the gamma-ray spectrum from the fission products in the fuel, the Monte Carlo toolkit Geant4 [4] for simulating the gamma-ray transport in and around the fuel and the emission of Cherenkov light, and the ray-tracing programme Zemax [5] used to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows for detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  2. Software and tools for microarray data analysis.

    PubMed

    Mehta, Jai Prakash; Rani, Sweta

    2011-01-01

    A typical microarray experiment results in a series of images, depending on the experimental design and number of samples. Software analyses the images to obtain the intensity at each spot and quantify the expression of each transcript. This is followed by normalization, and then various data analysis techniques are applied to the data. The whole analysis pipeline requires a large number of software packages to accurately handle the massive amount of data. Fortunately, there are many freely available and commercial software packages to churn the massive amount of data into manageable sets of differentially expressed genes, functions, and pathways. This chapter describes the software and tools that can be used to analyze gene expression data, from image analysis through to gene lists, ontology, and pathways.

  3. Evaluating Digital Authoring Tools

    ERIC Educational Resources Information Center

    Wilde, Russ

    2004-01-01

    As the quality of authoring software increases, online course developers become less reliant on proprietary learning management systems, and develop skills in the design of original, in-house materials and the delivery platforms for them. This report examines the capabilities of digital authoring software tools for the development of learning…

  4. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  5. A software tool for dataflow graph scheduling

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1994-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
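
    The partial ordering that a dataflow graph imposes can be made concrete with a small scheduling sketch. This is a generic level-by-level topological sort (Kahn's algorithm), not the paper's actual design process; tasks in the same level have no mutual data dependencies and may run in parallel on separate processors.

```python
from collections import deque

def topological_schedule(tasks, edges):
    """Order dataflow tasks by data precedence.

    tasks: iterable of task names; edges: (producer, consumer) pairs.
    Returns a list of "levels": all tasks within one level are mutually
    independent and can be assigned to different processors.
    """
    succ = {t: [] for t in tasks}
    indeg = {t: 0 for t in tasks}
    for a, b in edges:
        succ[a].append(b)
        indeg[b] += 1
    level = [t for t in tasks if indeg[t] == 0]
    schedule = []
    while level:
        schedule.append(sorted(level))
        nxt = []
        for t in level:
            for s in succ[t]:
                indeg[s] -= 1
                if indeg[s] == 0:
                    nxt.append(s)
        level = nxt
    return schedule

# A diamond-shaped dataflow graph: A feeds B and C, which both feed D.
print(topological_schedule(
    ["A", "B", "C", "D"],
    [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]))
```

    For repetitive execution, the same level structure is reused on every iteration, which is what makes this class of problems attractive for static multiprocessor scheduling.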

  6. A Methodology for Software Evaluation.

    ERIC Educational Resources Information Center

    Comer, Priscilla Garrido; Geissler, Colin

    Evaluators of software for education must make a series of decisions about which issues have a direct impact on their choice of software. Instructional context analysis is the first step, including identifying the learners, the instructor, the learning environment, and technical needs and limitations. The next step is instructional goal analysis;…

  7. Electronic Resources Evaluation Central: Using Off-the-Shelf Software, Web 2.0 Tools, and LibGuides to Manage an Electronic Resources Evaluation Process

    ERIC Educational Resources Information Center

    England, Lenore; Fu, Li

    2011-01-01

    A critical part of electronic resources management, the electronic resources evaluation process is multi-faceted and includes a seemingly endless range of resources and tools involving numerous library staff. A solution is to build a Web site to bring all of the components together that can be implemented quickly and result in an organizational…

  9. Test and Evaluation of WiMAX Performance Using Open-Source Modeling and Simulation Software Tools

    DTIC Science & Technology

    2010-12-01

    Computer Engineering Department at the Georgia Institute of Technology, the Electrical Engineering Department at the University of Washington, and the Google... A search on the Association of Computing Machinery (ACM) portal for articles involving the keyword ''ns-3''... diminishes usability but also denies the user the ability to graphically specify network topologies. A freeware tool called Radio Mobile (RM) (Coude n.d

  10. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  12. Structure and software tools of AIDA.

    PubMed

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. The relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates independently of the terminal and is, to a great extent, even multilingual. Most of these features are table-driven; this allows on-line changes in terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible, code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  13. Management of Astronomical Software Projects with Open Source Tools

    NASA Astrophysics Data System (ADS)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open-source tools: a build and automated-test system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating systems; version control and change management; an enhanced wiki and issue-tracking system for online documentation and reporting; and groupware tools such as a blog, discussion forum, and calendar. Initially, starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluating various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA now use it as well.

  14. Video-Tracking-Box linked to Smart software as a tool for evaluation of locomotor activity and orientation in brain-injured rats.

    PubMed

    Otero, Laura; Zurita, Mercedes; Aguayo, Concepción; Bonilla, Celia; Rodríguez, Alicia; Vaquero, Jesús

    2010-04-30

    Injuries of the Central Nervous System (CNS) cause devastating and irreversible losses of function. In order to analyze the deficits subsequent to brain injury, it is necessary to use behavioral tests that evaluate cerebral dysfunction. In this study, we describe a new tool, the Video-Tracking-Box (VTB) linked to Smart software. This new method adequately quantifies parameters related to locomotor activity and orientation in brain-injured rats. The method has been used in our laboratory to measure behavioral outcome after brain injury caused by intracerebral hemorrhage (ICH) in adult Wistar rats. In our experimental model, ICH was induced by stereotactic injection of 0.5 U of collagenase type IV into the striatum. ICH-injured rats showed decreased motor coordination and deficits in cognitive memory. The VTB-Smart test was sensitive to chronic locomotor and orientation dysfunction, and it was performed between 1 and 5 months after ICH. Our results revealed a significant increase in motor latency and loss of spatial orientation in the damaged animals compared with intact animals. These data demonstrate that our VTB, linked to Smart software, offers a reliable measure for assessing motor dysfunction and orientation after brain injury.
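
    As a rough illustration of the kind of quantities such a tracking setup yields, the sketch below derives path length and goal latency from a tracked (x, y) trajectory. The coordinate format, frame rate, and the 5 cm proximity criterion are assumptions for illustration, not Smart's actual output format.

```python
import math

def path_metrics(track, goal, fps=25):
    """Path length and goal latency from a per-frame (x, y) trajectory.

    track: per-frame coordinates from the video tracker (cm);
    goal: target location; fps: camera frame rate.
    Latency is the time in seconds until the animal first comes within
    5 cm of the goal, or None if it never does.
    """
    # Total distance traveled: sum of straight segments between frames.
    length = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    latency = None
    for frame, point in enumerate(track):
        if math.dist(point, goal) <= 5.0:
            latency = frame / fps
            break
    return length, latency

track = [(0, 0), (3, 4), (6, 8), (9, 12)]
print(path_metrics(track, goal=(10, 12)))
```

    Comparing these metrics between lesioned and intact animals is the kind of analysis the VTB-Smart setup automates.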

  15. Westinghouse waste simulation and optimization software tool

    SciTech Connect

    Mennicken, Kim; Aign, Jorg

    2013-07-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence-based decisions and enables users to try out many 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D&D activities. Finding a commercially and technically optimized waste treatment concept is a time-consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool enables the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and integrated into the simulation model. This capability ensures that the tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After suitable equipment data are implemented into the model, process requirements and waste treatment data are fed into the simulation to generate primary simulation results. A sensitivity analysis of the automated optimization features of the software generates the lowest possible life-cycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to effective planning and minimize the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner, taking

  16. Evolution of Educational Software Evaluation: Instructional Software Assessment

    ERIC Educational Resources Information Center

    Bayram, Servet; Nous, Albert P.

    2004-01-01

    Many popular terms such as software description, software review, software evaluation and, lastly, software usability are used by design, development and evaluation experts in the field. However, such terms, used interchangeably by researchers and developers, are syntactically and semantically different due to their conceptual backgrounds and the…

  17. TREHS: An open-access software tool for investigating and evaluating temporary river regimes as a first step for their ecological status assessment.

    PubMed

    Gallart, Francesc; Cid, Núria; Latron, Jérôme; Llorens, Pilar; Bonada, Núria; Jeuffroy, Justin; Jiménez-Argudo, Sara-María; Vega, Rosa-María; Solà, Carolina; Soria, Maria; Bardina, Mònica; Hernández-Casahuga, Antoni-Josep; Fidalgo, Aránzazu; Estrela, Teodoro; Munné, Antoni; Prat, Narcís

    2017-12-31

    When the regime of a river is not perennial, there are four main difficulties with the use of hydrographs for assessing hydrological alteration: i) the main hydrological features relevant for biological communities are not quantitative (discharges) but qualitative (phases such as flowing water, stagnant pools or lack of surface water); ii) stream flow records do not report the temporal occurrence of stagnant pools; iii) as most temporary streams are ungauged, their regime has to be evaluated by alternative methods such as remote sensing or citizen science; and iv) the biological quality assessment of the ecological status of a temporary stream must follow a sampling schedule and references adapted to the flow-pool-dry regime. To overcome these challenges with an operational approach, the freely available software tool TREHS has been developed within the EU LIFE TRIVERS project. This software permits the input of information from flow simulations obtained with any rainfall-runoff model (to set an unimpacted reference stream regime) and compares this with the information obtained from flow gauging records (if available) and interviews with local people, as well as instantaneous observations by individuals and interpretation of ground-level or aerial photographs. Up to six metrics defining the permanence of water flow, the presence of stagnant pools and their temporal patterns of occurrence are used to determine natural and observed river regimes and to assess the degree of hydrological alteration. A new regime classification specifically designed for temporary rivers was developed using the metrics that measure the relative permanence of the three main phases: flow, disconnected pools and dry stream bed.
Finally, the software characterizes the differences between the natural and actual regimes, diagnoses the hydrological status (degree of hydrological alteration), assesses the significance and robustness of the diagnosis and recommends the best periods
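
    A minimal sketch of phase-permanence metrics of this kind is shown below; the phase labels and definitions are illustrative, not the six TREHS metrics themselves.

```python
def regime_metrics(monthly_phases):
    """Summarize a temporary-river regime from monthly phase observations.

    monthly_phases: sequence of 'flow', 'pools', or 'dry' labels, e.g.
    one per month over one or more years. Returns the relative
    permanence (fraction of observations) of each phase.
    """
    n = len(monthly_phases)
    counts = {"flow": 0, "pools": 0, "dry": 0}
    for phase in monthly_phases:
        counts[phase] += 1
    return {phase: c / n for phase, c in counts.items()}

# A hypothetical Mediterranean-type year: 7 flowing months,
# 3 months of disconnected pools, 2 dry months.
year = ["flow"] * 7 + ["pools"] * 3 + ["dry"] * 2
print(regime_metrics(year))
```

    Comparing such fractions between a modeled reference regime and the observed one gives a simple measure of hydrological alteration in the spirit described above.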

  18. Evaluating software testing strategies

    NASA Technical Reports Server (NTRS)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise did not differ in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did those of junior expertise, and did not differ from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  19. Treatment Deployment Evaluation Tool

    SciTech Connect

    M. A. Rynearson; M. M. Plum

    1999-08-01

    The U.S. Department of Energy (DOE) is responsible for the final disposition of legacy spent nuclear fuel (SNF). In response, DOE's National Spent Nuclear Fuel Program (NSNFP) has been given the responsibility for the disposition of DOE-owned SNF. Many treatment technologies have been identified to treat some forms of SNF so that the resulting treated product is acceptable to the disposition site. One of these promising treatment processes is the electrometallurgical treatment (EMT) currently in development; a second is an Acid Wash Decladding process. The NSNFP has been tasked with identifying possible strategies for the deployment of these treatment processes in the event that a treatment path is deemed necessary. To support the siting studies of these strategies, economic evaluations are being performed to identify the least-cost deployment path. This model (tool) was developed to consider the full scope of costs, technical feasibility, process material disposition, and schedule attributes over the life of each deployment alternative. Using standard personal computer (PC) software, the model was developed as a comprehensive technology economic assessment tool using a Life-Cycle Cost (LCC) analysis methodology. Model development was planned as a systematic, iterative process of identifying and bounding the required activities to dispose of SNF. To support the evaluation process, activities are decomposed into lower-level, easier-to-estimate activities. Sensitivity studies can then be performed on these activities, defining cost issues and testing results against the originally stated problem.
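
    The LCC methodology such a model rests on can be illustrated with a toy discounted-cost comparison; the discount rate, cost figures, and the two deployment paths below are invented for illustration.

```python
def life_cycle_cost(annual_costs, discount_rate):
    """Discounted life-cycle cost of a deployment alternative.

    annual_costs[t] is the cost incurred in year t (year 0 = up-front
    capital). Each year's cost is discounted back to present value.
    """
    return sum(c / (1 + discount_rate) ** t
               for t, c in enumerate(annual_costs))

# Two hypothetical deployment paths at a 5% discount rate:
# A = high capital / low operating, B = low capital / high operating.
path_a = [100.0] + [5.0] * 10
path_b = [40.0] + [12.0] * 10
rate = 0.05
best = min(("A", path_a), ("B", path_b),
           key=lambda p: life_cycle_cost(p[1], rate))
print(best[0])
```

    Sensitivity studies of the kind described amount to re-running this comparison while varying individual cost elements or the discount rate.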

  1. Assessment tool for pharmacy drug-drug interaction software.

    PubMed

    Warholak, Terri L; Hines, Lisa E; Saverno, Kim R; Grizzle, Amy J; Malone, Daniel C

    2011-01-01

    To assess the performance of pharmacy clinical decision support (CDS) systems for drug-drug interaction (DDI) detection and to identify approaches for improving the ability to recognize important DDIs. Pharmacists rely on CDS systems to assist in the identification of DDIs, and research suggests that these systems perform suboptimally. The software evaluation tool described here may be used in all pharmacy settings that use electronic decision support to detect potential DDIs, including large and small community chain pharmacies, community independent pharmacies, hospital pharmacies, and governmental facility pharmacies. A tool is provided to determine the ability of pharmacy CDS systems to identify established DDIs. It can be adapted to evaluate potential DDIs that reflect local practice patterns and patient safety priorities. Beyond assessing software performance, going through the evaluation processes creates the opportunity to evaluate inadequacies in policies, procedures, workflow, and training of all pharmacy staff relating to pharmacy information systems and DDIs. The DDI evaluation tool can be used to assess pharmacy information systems' ability to recognize relevant DDIs. Suggestions for improvement include determining whether the software allows for customization, creating standard policies for handling specific interactions, and ensuring that drug knowledge database updates occur frequently.
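
    A minimal sketch of such an assessment, scoring a CDS system's alerts against reference lists of interacting and non-interacting drug pairs, is shown below; the drug pairs are illustrative, not the tool's actual test set.

```python
def assess_ddi_software(flagged_pairs, known_ddis, non_interacting):
    """Score a pharmacy CDS system against reference drug-pair lists.

    flagged_pairs: pairs the software alerted on; known_ddis: pairs
    that should be flagged; non_interacting: pairs that should not be.
    Returns sensitivity and specificity, the kind of performance
    measures such an evaluation reports.
    """
    # frozenset makes pair order irrelevant: (a, b) == (b, a).
    flagged = {frozenset(p) for p in flagged_pairs}
    tp = sum(1 for p in known_ddis if frozenset(p) in flagged)
    tn = sum(1 for p in non_interacting if frozenset(p) not in flagged)
    return {"sensitivity": tp / len(known_ddis),
            "specificity": tn / len(non_interacting)}

known = [("warfarin", "aspirin"), ("simvastatin", "clarithromycin")]
safe = [("amoxicillin", "acetaminophen"), ("lisinopril", "metformin")]
result = assess_ddi_software(
    [("aspirin", "warfarin"), ("lisinopril", "metformin")], known, safe)
print(result)
```

    In this hypothetical run the system misses one established interaction and raises one unnecessary alert, scoring 0.5 on both measures.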

  2. BoBB, software to assess soil erosion risk - introduction of the tool and its use to evaluate appropriate crops and farming practices on endangered field plots

    NASA Astrophysics Data System (ADS)

    Devátý, Jan; Dostál, Tomáš; Hösl, Rosemarie; Strauss, Peter; Novotný, Ivan

    2013-04-01

    BoBB (Bodenerosion, Beratung, Berechnung) is a simple software tool to support rapid assessment of soil erosion hazard on agricultural fields. The program is profile-oriented, implementing the RUSLE model with slight changes that allow it to assess and compare different farming practices, especially soil-conservation field management. The input parameter datasets are supplied with the necessary data for the territory and natural conditions of Upper Austria but are generally usable for Central Europe. The software was developed at the Federal Agency for Water Management, Petzenkirchen, Austria in 2011-2012. BAW and CTU in Prague are currently cooperating on validation and practical applicability approval of the model. Basic validation was done by comparing the outputs of the BoBB software with outputs of the original RUSLE model calculated by the RUSLE1 (USDA, 1998) and RUSLE2 (USDA, 2005) software. Further evaluation was performed to test the ability of BoBB to reveal field plots endangered by soil erosion. First, testing areas were selected from a map of soil erosion risk that had been calculated for the whole territory of the Czech Republic using a combination of the USLE approach and a GIS approach referring to the best available data set. This map, at 10x10 meter resolution, is used as the basic source for assessment of soil erosion hazard and of the necessity of GAEC requirements (good agricultural practice assessment for agricultural subsidy policy), and is therefore accepted as a standard at the state level. Characteristic profiles were selected within the defined testing areas, and the soil erosion hazard determined by the USLE approach and by BoBB was then compared. Second, a comparison of BoBB outputs with the database of soil erosion events (http://me.vumop.cz) was carried out. The database is created and maintained by the Czech Institute of Soil Conservation as a unique tool for soil erosion mapping and documentation. It was launched in 2010 and currently contains approximately
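
    Since BoBB implements RUSLE, the core calculation can be sketched directly. The factor values below are invented for illustration; in a real application L and S are derived from the profile geometry and K, C, P from soil and management data.

```python
def rusle_soil_loss(R, K, L, S, C, P):
    """Mean annual soil loss A = R * K * L * S * C * P under RUSLE.

    R: rainfall erosivity, K: soil erodibility, L and S: slope length
    and steepness factors, C: cover-management, P: support practice.
    Units depend on the factor system used (e.g. t/ha/yr).
    """
    return R * K * L * S * C * P

# Hypothetical plot: conventional tillage (high C, no support practice)
# vs. conservation management (low C, contouring halves P).
base = dict(R=80.0, K=0.35, L=1.2, S=1.5)
conventional = rusle_soil_loss(**base, C=0.30, P=1.0)
conservation = rusle_soil_loss(**base, C=0.05, P=0.5)
print(conventional > conservation)
```

    Comparing alternatives this way, factor by factor, is exactly the kind of farming-practice comparison the tool is built for.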

  3. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
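
    A crude stand-in for the knickpoint search is sketched below: it flags slope breaks along a DEM-extracted longitudinal profile. The threshold and profile values are invented, and Knickpoint Finder's actual criteria may differ.

```python
def find_knickpoints(distance, elevation, threshold=0.5):
    """Flag slope breaks along a longitudinal stream profile.

    distance/elevation: profile samples ordered downstream (elevation
    decreasing). A point is flagged when the downstream gradient
    steepens by more than `threshold` relative to the upstream segment.
    Returns the indices of the flagged profile points.
    """
    knicks = []
    for i in range(1, len(distance) - 1):
        # Local gradients of the segments upstream and downstream of i.
        up = (elevation[i - 1] - elevation[i]) / (distance[i] - distance[i - 1])
        down = (elevation[i] - elevation[i + 1]) / (distance[i + 1] - distance[i])
        if down - up > threshold:
            knicks.append(i)
    return knicks

# Gentle profile with one abrupt step between points 2 and 3.
dist = [0, 100, 200, 300, 400]
elev = [500, 498, 496, 430, 428]
print(find_knickpoints(dist, elev))
```

    Mapping such flagged points across a drainage network, as the tool does on the ArcGIS platform, is what allows their spatial clustering to be compared against lineaments and epicenters.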

  4. Authoring tool evaluation

    SciTech Connect

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  5. STAYSL PNNL Suite of Software Tools.

    SciTech Connect

    GREENWOOD, LARRY R.

    2013-07-19

    Version: 00 The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.

  6. Evaluation Methodology for Software Engineering

    DTIC Science & Technology

    1988-05-31

    Recall that the standard QWERTY keyboard format initially was chosen because it would slow performance and thus prevent the jamming of keys; this...retention may become a software parallel to the QWERTY keyboard. Given this stratification of the problem domain, what evaluation methods can be applied? It

  7. The LIBS Internet Access Software: An Overview and Evaluation.

    ERIC Educational Resources Information Center

    Stanton, Deidre E.; Hooper, Todd

    1992-01-01

    Describes and evaluates LIBS Internet Access Software (also called Sonoma Software), which offers automatic Telnet connection to remote library catalogs, databases, information services, campuswide information services, and other wide-area information access tools. Instructions for obtaining and installing the software are given, and a comparison…

  9. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It searches for one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
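
    The sensor-selection idea behind S4 can be sketched as an exhaustive search over sensor subsets. The sensor names, fault modes, and coverage objective below are invented for illustration; S4's actual objective functions and search techniques are user-configurable.

```python
from itertools import combinations

def select_sensors(fault_coverage, max_sensors):
    """Pick the sensor suite covering the most fault modes under a size limit.

    fault_coverage maps sensor name -> set of detectable fault modes.
    Exhaustively evaluates every suite of up to max_sensors sensors and
    keeps the one with the largest covered fault set.
    """
    best, best_faults = (), set()
    for k in range(1, max_sensors + 1):
        for suite in combinations(sorted(fault_coverage), k):
            covered = set().union(*(fault_coverage[s] for s in suite))
            if len(covered) > len(best_faults):
                best, best_faults = suite, covered
    return best, best_faults

coverage = {
    "EGT": {"turbine_wear", "nozzle_erosion"},
    "N2": {"compressor_foul", "turbine_wear"},
    "Wf": {"fuel_leak"},
}
suite, faults = select_sensors(coverage, max_sensors=2)
print(suite, len(faults))
```

    Brute force only scales to small suites; frameworks like S4 exist precisely because realistic sensor/fault spaces demand smarter search techniques over the same kind of objective.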

  10. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  11. A Software Communication Tool for the Tele-ICU

    PubMed Central

    Pimintel, Denise M.; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while maintaining a standard that improves time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider. PMID:24551398

  12. SAS: a yield/failure analysis software tool

    NASA Astrophysics Data System (ADS)

    de Jong Perez, Susana

    1996-09-01

    As device sizes decrease and the number of interconnect levels and wafer size increase, device yield and failure analysis becomes more complex. Currently, software tools are being used to perform visual inspection techniques after many operations during which defects are detected on a sample of wafers. However, it has been observed that the correlation between the yield predicted on the basis of the defects found during such observations and the yield determined electrically at wafer final test is low. Of greater interest to yield/failure analysis software tools is statistical analysis software. SAS(TM) can perform extensive data analysis on kerf test structures' electrical parameters. In addition, the software can merge parametric and yield/fail bins data, which reduces the data collection and data reduction activities involved in the correlation of device parameters to circuit functional operation. The data is saved in large databases which allow storage and later retrieval of historical data in order to evaluate process shifts and changes and their effect on yield. The merge of process parameters and on-line measurements with final electrical data is also possible with the aid of process parameter extraction software. All of this data analysis provides excellent feedback about integrated circuit wafer processing.

  13. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
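
    The exact Bayes factors mentioned above can be computed in closed form for conjugate models. The sketch below does so for a binomial likelihood, comparing a point null against a Beta prior; this is a textbook instance with illustrative data, not code or an example taken from the paper.

```python
from math import exp, lgamma, log

def log_beta(a, b):
    """log of the Beta function, via log-gamma for numerical stability."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def bayes_factor(k, n, theta0=0.5, a=1.0, b=1.0):
    """Exact BF01 for H0: theta = theta0 vs H1: theta ~ Beta(a, b), given k
    successes in n trials. The binomial coefficient cancels between the two
    marginal likelihoods, so it is omitted from both."""
    log_m0 = k * log(theta0) + (n - k) * log(1.0 - theta0)          # point null
    log_m1 = log_beta(a + k, b + n - k) - log_beta(a, b)            # beta-binomial
    return exp(log_m0 - log_m1)

# Balanced data favor the point null (BF01 > 1); lopsided data favor the
# vague alternative (BF01 << 1).
print(bayes_factor(10, 20))
print(bayes_factor(18, 20))
```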

  15. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
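
    The kind of traffic-flow model underlying these tools can be caricatured as a departure queue served at a fixed runway rate. The sketch below uses invented parameters and is far simpler than the calibrated, validated model the tools implement:

```python
def simulate_departures(pushback_times, unimpeded_taxi, service_interval):
    """Toy departure-queue model: each aircraft reaches the runway queue after
    an unimpeded taxi, then waits for the runway, which releases one departure
    every `service_interval` minutes. Returns wheels-off times (minutes)."""
    wheels_off = []
    runway_free = 0.0
    for t in sorted(pushback_times):
        ready = t + unimpeded_taxi            # earliest the aircraft can depart
        takeoff = max(ready, runway_free)     # may have to wait in the queue
        wheels_off.append(takeoff)
        runway_free = takeoff + service_interval
    return wheels_off

# Three pushbacks in quick succession; the runway departs one aircraft per
# 2 minutes, so queueing delay accumulates.
print(simulate_departures([0, 1, 2], unimpeded_taxi=10, service_interval=2))
```

    A model of this shape is enough to see how taxi-time prediction and "windowing" congestion control (holding pushbacks to cap the queue) would operate.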

  16. Software Metrics Useful Tools or Wasted Measurements

    DTIC Science & Technology

    1990-05-01

    shared by the developers of the field of software metrics. Capers Jones, Chairman of Software Productivity Research, Inc. and a noted pioneer in...development efforts in terms of function points. That will give you a basis for measuring productivity. Capers Jones, chairman of Software... Capers Jones, "Building a better metric," Computerworld Extra, 22 (June 20, 1988):39. 24 Allen J. Albrecht and John E. Gaffney, Jr., "Software Function

  17. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
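
    The SUVR and effect-size computations described above are standard; the sketch below uses made-up uptake values and SUVRs (not the study's data) to show the arithmetic:

```python
from statistics import mean, stdev

def suvr(target_uptake, reference_uptake):
    """Standardized uptake value ratio: mean uptake in the target VOI divided
    by mean uptake in the reference region (here, cerebellar cortex)."""
    return mean(target_uptake) / mean(reference_uptake)

def cohens_d(group_a, group_b):
    """Cohen's d effect size: mean difference over the pooled (sample)
    standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled = (((na - 1) * stdev(group_a) ** 2 + (nb - 1) * stdev(group_b) ** 2)
              / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled

# Hypothetical composite-neocortex SUVRs for AD patients vs. healthy controls.
ad = [1.8, 1.4, 1.6, 1.2, 1.9]
hc = [1.3, 1.0, 1.5, 1.1, 1.2]
print(suvr([2.1, 2.3], [1.4, 1.6]))   # one target VOI vs. reference region
print(cohens_d(ad, hc))               # group separation
```

    The group comparison itself would use a Mann-Whitney U test (e.g. `scipy.stats.mannwhitneyu`), as in the study; Cohen's d is shown here because it can be computed with the standard library alone.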

  18. Clinical evaluation of a dose monitoring software tool based on Monte Carlo Simulation in assessment of eye lens doses for cranial CT scans.

    PubMed

    Guberina, Nika; Suntharalingam, Saravanabavaan; Naßenstein, Kai; Forsting, Michael; Theysohn, Jens; Wetter, Axel; Ringelstein, Adrian

    2016-10-01

    The aim of this study was to verify the results of a dose monitoring software tool based on Monte Carlo Simulation (MCS) in assessment of eye lens doses for cranial CT scans. In cooperation with the Federal Office for Radiation Protection (Neuherberg, Germany), phantom measurements were performed with thermoluminescence dosimeters (TLD LiF:Mg,Ti) using cranial CT protocols: (I) CT angiography; (II) unenhanced cranial CT scans with gantry angulation at a single source CT scanner; and (III) unenhanced cranial CT scans without gantry angulation at a dual source CT scanner. Eye lens doses calculated by the dose monitoring tool based on MCS and assessed with TLDs were compared. Eye lens doses are summarized as follows: (I) CT angiography: (a) MCS 7 mSv, (b) TLD 5 mSv; (II) unenhanced cranial CT scan with gantry angulation: (c) MCS 45 mSv, (d) TLD 5 mSv; (III) unenhanced cranial CT scan without gantry angulation: (e) MCS 38 mSv, (f) TLD 35 mSv. Intermodality comparison shows an inaccurate calculation of eye lens doses in unenhanced cranial CT protocols at the single source CT scanner due to the disregard of gantry angulation. On the contrary, the dose monitoring tool showed an accurate calculation of eye lens doses at the dual source CT scanner without gantry angulation and for CT angiography examinations. The dose monitoring software tool based on MCS gave accurate estimates of eye lens doses in cranial CT protocols. However, knowledge of protocol and software specific influences is crucial for correct assessment of eye lens doses in routine clinical use.

  19. SAPHIRE models and software for ASP evaluations

    SciTech Connect

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D.

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
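
    The cutset quantification SAPHIRE performs can be illustrated with the standard minimal-cutset upper bound. The probabilities below are invented, and the CCDP increment shown is a deliberate simplification of the actual ASP methodology:

```python
def min_cut_upper_bound(cutset_probs):
    """Minimal-cutset upper bound on the top-event probability:
    P(top) <= 1 - prod(1 - P(cutset_i))."""
    p = 1.0
    for q in cutset_probs:
        p *= (1.0 - q)
    return 1.0 - p

def ccdp_increment(p_cd_conditional, p_cd_nominal):
    """Simplified precursor importance: conditional core damage probability
    (with the precursor event reflected in the model) minus the nominal
    baseline. Real ASP evaluations involve more than this subtraction."""
    return p_cd_conditional - p_cd_nominal

# Hypothetical cutset probabilities: nominal model vs. the same model with
# one basic event degraded by an observed precursor condition.
base = min_cut_upper_bound([1e-5, 4e-6, 2e-6])
cond = min_cut_upper_bound([1e-3, 4e-6, 2e-6])
print(ccdp_increment(cond, base))
```

    Retaining all cutsets, as the SAPHIRE ASP module does, is what makes this requantification step cheap when a precursor changes a single basic-event probability.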

  20. VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1994-01-01

    VTGRAPH is a graphics software tool for DEC/VT or VT compatible terminals which are widely used by government and industry. It is a FORTRAN or C-language callable library designed to allow the user to deal with many computer environments which use VT terminals for window management and graphic systems. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design more friendly user interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGIS graphics set, which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10 compatible drawing, generic program routines for two and three dimensional plotting, and color graphics or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGIS graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.

  1. The safety-critical software evaluation assistant (SEA)

    SciTech Connect

    Persons, W.L.

    1995-10-01

    The Computer Safety and Reliability Group at Lawrence Livermore National Laboratory (LLNL) is researching the evaluation of software used in safety-critical applications. This paper describes one of the research and development efforts currently underway to model the software evaluation process and to develop a software evaluation tool. One of the primary techniques available for determining the safety of software proposed for use in safety-critical applications is to evaluate the software development process and the resulting products. This model of the evaluation process was influenced by several factors; the underlying motivation was to identify, control, and reduce the risk inherent in building safety-critical software systems. The prototype tool, the Software Evaluation Assistant (SEA), assists and guides evaluators as they analyze safety-critical software. SEA describes specific evaluation goals, provides a brief overview of the specific evaluation process, identifies potential risks of not performing the evaluation, identifies the skills required to carry out the evaluation of a particular topic, identifies the material that should typically be available for the evaluation, and poses questions used to examine and rate the software item.

  2. An evaluation of the Interactive Software Invocation System (ISIS) for software development applications. [flight software

    NASA Technical Reports Server (NTRS)

    Noland, M. S.

    1981-01-01

    The Interactive Software Invocation System (ISIS), which allows a user to build, modify, control, and process a total flight software system without direct communications with the host computer, is described. This interactive data management system provides the user with a file manager, text editor, a tool invoker, and an Interactive Programming Language (IPL). The basic file design of ISIS is a five level hierarchical structure. The file manager controls this hierarchical file structure and permits the user to create, to save, to access, and to purge pages of information. The text editor is used to manipulate pages of text to be modified and the tool invoker allows the user to communicate with the host computer through a RUN file created by the user. The IPL is based on PASCAL and contains most of the statements found in a high-level programming language. In order to evaluate the effectiveness of the system as applied to a flight project, the collection of software components required to support the Annular Suspension and Pointing System (ASPS) flight project were integrated using ISIS. The ASPS software system and its integration into ISIS is described.

  3. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  5. A Tool for Managing Software Architecture Knowledge

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    This paper describes a tool for managing architectural knowledge and rationale. The tool has been developed to support a framework for capturing and using architectural knowledge to improve the architecture process. This paper describes the main architectural components and features of the tool. The paper also provides examples of using the tool for supporting well-known architecture design and analysis methods.

  6. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and ran as a stand alone applic...

  7. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  8. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  11. Cyber Security Evaluation Tool

    SciTech Connect

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  12. Evacuation performance evaluation tool.

    PubMed

    Farra, Sharon; Miller, Elaine T; Gneuhs, Matthew; Timm, Nathan; Li, Gengxin; Simon, Ashley; Brady, Whittney

    2016-01-01

    Hospitals conduct evacuation exercises to improve performance during emergency events. An essential aspect of this process is the creation of reliable and valid evaluation tools. The objective of this article is to describe the development and implications of a disaster evacuation performance tool that measures one portion of the very complex process of evacuation. Through the application of the Delphi technique and DeVellis's framework, disaster and neonatal experts provided input in developing this performance evaluation tool. Following development, content validity and reliability of this tool were assessed at a large pediatric hospital and medical center in the Midwest. The tool was pilot tested with an administrative, medical, and nursing leadership group and then implemented with a group of 68 healthcare workers during a disaster exercise of a neonatal intensive care unit (NICU). The tool has demonstrated high content validity, with a scale validity index of 0.979, and an inter-rater reliability G coefficient of 0.984 (95% CI: 0.948-0.9952). The Delphi process based on the conceptual framework of DeVellis yielded a psychometrically sound evacuation performance evaluation tool for a NICU.
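
    A scale-level content validity index like the one reported above is conventionally computed from expert relevance ratings. The sketch below uses invented ratings, not the study's data:

```python
def item_cvi(ratings, relevant=(3, 4)):
    """Item-level content validity index: the fraction of experts rating the
    item relevant (3 or 4 on a conventional 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI (averaging approach): mean of the item-level CVIs."""
    cvis = [item_cvi(r) for r in items]
    return sum(cvis) / len(cvis)

# Five hypothetical experts rate three evacuation-performance items, 1-4.
ratings = [
    [4, 4, 3, 4, 4],
    [3, 4, 4, 4, 2],
    [4, 3, 4, 4, 4],
]
print(scale_cvi(ratings))
```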

  13. Workshop on Software Development Tools for Petascale Computing

    SciTech Connect

    Vetter, Jeffrey

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  14. Tools Ensure Reliability of Critical Software

    NASA Technical Reports Server (NTRS)

    2012-01-01

    In November 2006, after attempting to make a routine maneuver, NASA's Mars Global Surveyor (MGS) reported unexpected errors. The onboard software switched to backup resources, and a 2-day lapse in communication took place between the spacecraft and Earth. When a signal was finally received, it indicated that MGS had entered safe mode, a state of restricted activity in which the computer awaits instructions from Earth. After more than 9 years of successful operation, gathering data and snapping pictures of Mars to characterize the planet's land and weather, communication between MGS and Earth suddenly stopped. Months later, a report from NASA's internal review board found the spacecraft's battery failed due to an unfortunate sequence of events. Updates to the spacecraft's software, which had taken place months earlier, were written to the wrong memory address in the spacecraft's computer. In short, the mission ended because of a software defect. Over the last decade, spacecraft have become increasingly reliant on software to carry out mission operations. In fact, the next mission to Mars, the Mars Science Laboratory, will rely on more software than all earlier missions to Mars combined. According to Gerard Holzmann, manager at the Laboratory for Reliable Software (LaRS) at NASA's Jet Propulsion Laboratory (JPL), even the fault protection systems on a spacecraft are mostly software-based. For reasons like these, well-functioning software is critical for NASA. In the same year as the failure of MGS, Holzmann presented a new approach to critical software development to help reduce risk and provide consistency. He proposed The Power of 10: Rules for Developing Safety-Critical Code, which is a small set of rules that can easily be remembered, clearly relate to risk, and allow compliance to be verified. The reaction at JPL was positive, and developers in the private sector embraced Holzmann's ideas.

  15. The Mainsail Project: Developing Tools for Software Portability

    PubMed Central

    Wilcox, Clark R.

    1977-01-01

    The MAINSAIL project is part of the SUMEX Computer Project funded by the Biotechnology Resources Program, National Institutes of Health. Its basic goal is to provide a machine-independent programming system suitable for the development of large, portable programs. MAINSAIL is an ALGOL-like language with dynamic memory support for strings, arrays, records, modules and files. In this paper we give an overview of how portability is achieved, and of several approaches to code generation and execution. MAINSAIL, both a practical language and a research tool, provides a unique opportunity to evaluate alternative approaches to software portability.

  16. Considerations for the Evaluation of Software.

    ERIC Educational Resources Information Center

    Fields, Thomas A.

    1984-01-01

    The paper describes the decision process involved in purchasing software for the professional's use of the microcomputer. Considerations to ensure that the software purchased will be a success, determination of which software will be needed, and options that facilitate the location and evaluation of software are discussed. (Author/CL)

  17. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  18. EISA 432 Energy Audits Best Practices: Software Tools

    SciTech Connect

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  19. Software Selection, Evaluation and Organization [and] Software Reviews. Article Reprints.

    ERIC Educational Resources Information Center

    Computing Teacher, 1985

    1985-01-01

    This collection of reprints from The Computing Teacher contains 11 articles on the selection, evaluation, and organization of software published between August 1983 and March 1986, as well as more than 20 reviews of educational software packages published between December 1982 and June 1986. The articles are: (1) "The New Wave of Educational…

  1. A Guide to Evaluated Educational Software. SEED Software Annotations.

    ERIC Educational Resources Information Center

    South Carolina Educational Television Network Columbia.

    Reviews of 142 education software packages are contained in this guide produced by Project SEED. Following listings of software by title, grade level, and subject area, the alphabetical list of evaluations provides information in the following areas for each program: (1) title; (2) producer; (3) copyright data; (4) grade level; (5) subject area;…

  2. Testing Automation Tools for Secure Software Development

    DTIC Science & Technology

    2007-06-01

    ...operation of the system. This technique is not particularly effective. Thus, recent research has focused on developing new, more effective software...

  3. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
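
    The sensor-loss sensitivity study described above can be sketched over a boolean fault-versus-test detection matrix. The faults, tests, and matrix below are invented; this is not the TEAMS Designer model format:

```python
# Hypothetical detection matrix: which tests detect which faults.
DETECTS = {
    "valve_stuck":   {"pressure_test", "flow_test"},
    "sensor_drift":  {"voltage_test"},
    "line_blockage": {"flow_test"},
    "short_circuit": {"voltage_test", "current_test"},
}

def fault_coverage(available_tests):
    """Fraction of faults detectable by at least one available test."""
    detected = sum(1 for tests in DETECTS.values() if tests & available_tests)
    return detected / len(DETECTS)

all_tests = set().union(*DETECTS.values())
print(fault_coverage(all_tests))                     # full test suite
print(fault_coverage(all_tests - {"voltage_test"}))  # lose one sensor/test
```

    Repeating the computation with each test removed in turn ranks tests by how much detection coverage depends on them, which is the essence of the sensitivity study.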

  4. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  6. NASA Approach to HPCCP Support Software and Tools

    NASA Technical Reports Server (NTRS)

    Blaylock, Bruce; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    The NASA HPCC Program, together with other agencies participating in the Federal HPCC Program, intends to advance technologies to enable the execution of grand challenge applications at sustained rates up to TeraFLOPS. During 1995-96, NASA undertook two major systems software efforts to improve the state of high performance support software and tools. The first of these activities was a replanning of support software and tools activities internal to the Agency, with emphasis placed on meeting the needs of grand challenge users, pursuing few projects, and delivering near-term useful results. The revised NASA plan calls for support software and tools activities in four areas: application creation process support; application usage/operations support; advanced support software and tools concepts; and metrics-based monitoring and management. The second major activity undertaken was participation in a multiagency task force resulting from the Second Pasadena Workshop on System Software and Tools. The task force developed the Guidelines for Writing System Software and Tools Requirements for Parallel and Clustered Computers.

  7. Software for an Evaluation Workshop.

    ERIC Educational Resources Information Center

    Johnson, Judi Mathis

    1997-01-01

    Based on the "Educational Software Preview Guide," this article discusses trends; lists selected software titles related to the Internet, arts, multilingualism and multiculturalism, multimedia and edutainment, authentic experience, cross-curricular and alternative assessment; and describes how to conduct a software evaluation…

  8. Improving system quality through software evaluation.

    PubMed

    McDaniel, James G

    2002-05-01

    The role of evaluation is examined with respect to quality of software in healthcare. Of particular note is the failure of the Therac-25 radiation therapy machine. This example provides evidence of several types of defect which could have been detected and corrected using appropriate evaluation procedures. The field of software engineering has developed metrics and guidelines to assist in software evaluation but this example indicates that software evaluation must be extended beyond the formally defined interfaces of the software to its real-life operating context.

  9. Innovative Software Tools Measure Behavioral Alertness

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  10. Saphire models and software for ASP evaluations

    SciTech Connect

    Sattison, M.B.

    1997-02-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
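
    Item (3) above, cutset generation, can be illustrated with a toy fault tree. The gate and event names below are invented, and real PRA codes such as SAPHIRE use far more elaborate algorithms; this is only a sketch of the idea:

```python
# Illustrative sketch only: expand a tiny fault tree into cut sets
# (combinations of basic events that fail the top event), then keep
# the minimal ones. Gate/event names are invented examples.
def cut_sets(tree, node):
    """Expand a fault tree into cut sets (frozensets of basic events)."""
    if node not in tree:                      # leaf = basic event
        return [frozenset([node])]
    op, children = tree[node]
    child_sets = [cut_sets(tree, c) for c in children]
    if op == "OR":                            # any child's cut set suffices
        return [cs for group in child_sets for cs in group]
    # AND: cross-product union of the children's cut sets
    result = [frozenset()]
    for group in child_sets:
        result = [r | cs for r in result for cs in group]
    return result

def minimal(sets):
    """Keep only cut sets with no proper subset among the others."""
    return [s for s in sets if not any(t < s for t in sets)]

tree = {
    "TOP": ("OR",  ["PUMP_A", "G1"]),
    "G1":  ("AND", ["PUMP_B", "VALVE_C"]),
}
mcs = minimal(cut_sets(tree, "TOP"))
print(sorted(sorted(s) for s in mcs))
```

    Retaining the full (non-minimal) cut set list, as the abstract's point (3) describes, is what makes later logic changes and requantification cheap.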

  11. The Toxicity Estimation Software Tool (T.E.S.T.)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to estimate toxicological values for aquatic and mammalian species considering acute and chronic endpoints for screening purposes within TSCA and REACH programs.

  13. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification, and (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  14. Developing a Decision Support System: The Software and Hardware Tools.

    ERIC Educational Resources Information Center

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  15. NASA-Enhanced Version Of Automatically Programmed Tool Software (APT)

    NASA Technical Reports Server (NTRS)

    Purves, L. R.

    1989-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled machining. It is both a programming language and the software that processes that language. Upgrades include a "super pocket" capability for concave polygon pockets and an editor to reprocess cutter-location coordinates according to user-supplied commands.

  16. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; the ISWHM testbed architecture; and conclusions and next steps.

  17. iPhone examination with modern forensic software tools

    NASA Astrophysics Data System (ADS)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of this paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are of examining non-jailbroken and jailbroken iPhones.

  18. Software tool for xenon gamma-ray spectrometer control

    NASA Astrophysics Data System (ADS)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    A software tool, "Acquisition and Processing of Gamma-Ray Spectra," was developed for the control of xenon gamma-ray spectrometers. It supports a multi-window interface and provides acquisition of gamma-ray spectra from a xenon gamma-ray detector via USB or RS-485 interfaces (directly or over the TCP/IP protocol), energy calibration of gamma-ray spectra, and saving of spectra to disk.
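
    The energy-calibration step mentioned above is, at its simplest, a straight-line fit from ADC channel to energy through known reference peaks. The peak channels below are invented for illustration; the Cs-137 and Co-60 energies are standard values, but nothing here comes from the actual spectrometer software:

```python
# Minimal sketch of two-point energy calibration for a gamma-ray
# spectrometer: map ADC channel number to energy (keV) with a line
# through two reference photopeaks. Channel values are hypothetical.
def linear_calibration(ch1, e1, ch2, e2):
    """Return a function mapping ADC channel to energy (keV)."""
    slope = (e2 - e1) / (ch2 - ch1)
    offset = e1 - slope * ch1
    return lambda ch: slope * ch + offset

# Cs-137 photopeak (661.7 keV) assumed in channel 820,
# Co-60 peak (1332.5 keV) assumed in channel 1650.
to_kev = linear_calibration(820, 661.7, 1650, 1332.5)
print(round(to_kev(1000), 1))   # energy of an arbitrary channel
```

    Production software would typically fit more than two peaks by least squares and may add a quadratic term, but the mapping idea is the same.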

  19. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data tailored to Marshall Space Flight Center Unique Programs. The research also attempts to acquire knowledge on how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.

  20. Integrated software tool automates MOV diagnosis

    SciTech Connect

    Joshi, B.D.; Upadhyaya, B.R.

    1996-04-01

    This article reports that researchers at the University of Tennessee have developed digital signal processing software that takes the guesswork out of motor current signature analysis (MCSA). The federal testing regulations for motor-operated valves (MOV) used in nuclear power plants have recently come under critical scrutiny by the Nuclear Regulatory Commission (NRC) and the American Society of Mechanical Engineers (ASME). New ASME testing specifications mandate that all valves performing a safety function are to be tested -- not just ASME Code 1, 2 and 3 valves. The NRC will likely endorse the ASME regulations in the near future. Because of these changes, several utility companies have voluntarily expanded the scope of their in-service testing programs for MOVs, in spite of the additional expense.
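
    The core of motor current signature analysis is examining the amplitude of the motor current at particular frequencies, where fault-related components appear alongside the supply line frequency. The following sketch uses a synthetic signal and an invented fault frequency; it illustrates the general DSP idea only, not the University of Tennessee software:

```python
# Hedged sketch of the MCSA idea: measure the current signal's amplitude
# at chosen frequencies. The 45 Hz "fault signature" and its 0.05 size
# are synthetic examples; real MOV diagnosis is far more involved.
import math

def amplitude_at(signal, fs, freq):
    """Single-frequency DFT magnitude via correlation with sin/cos."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

fs = 1000                       # sampling rate, Hz
t = [i / fs for i in range(1000)]
# 60 Hz supply current plus a small synthetic 45 Hz fault component.
signal = [math.sin(2 * math.pi * 60 * x) +
          0.05 * math.sin(2 * math.pi * 45 * x) for x in t]

print(round(amplitude_at(signal, fs, 60), 3))   # supply component
print(round(amplitude_at(signal, fs, 45), 3))   # fault signature
```

    In practice an FFT over the whole band is used and fault frequencies are located relative to the slip and supply frequencies, but the single-bin view above is the essence of "signature" analysis.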

  1. A Software-System Visualization Tool

    DTIC Science & Technology

    1990-04-01

    ...databases, servers) and the dependencies established between these modules (e.g., procedure calls, message-passing channels). We use the tool to write...A MINION user can view these implicitly constructed connections and notice incorrect links to library modules and "regions" of a graph. We can scale...configuration programming systems. MINION handles this problem by hiding unimportant details or scaling them graphically. We can graphically select

  2. Software Engineering: Tools of the Profession

    DTIC Science & Technology

    1976-09-01

    hardware configuration. (D,G) Kernighan, Brian W. and Plauger, P. J., The Elements of Programming Style, McGraw-Hill Book Company, 1974. This book consists...further reference material on the subject. Kernighan, Brian W. and Plauger, P. J., Software Tools, Addison-Wesley Publishing Company, 1976. The authors...Brian Randell on page 47 of the report: "There are two distinct approaches to the problem of deciding in what order to make design decisions. The top

  3. Software Tools for Stochastic Simulations of Turbulence

    DTIC Science & Technology

    2015-08-28

    application programming interface (API) for porting front tracking models into arbitrary simulation codes is introduced. We discuss models for advecting...client interface to FTI. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code FLASH...coarse grid defines a Young measure solution to the PDE. The second tool is a front tracking application programming interface (API) called FTI. It has

  4. Software Tools for Acoustic Database Management

    DTIC Science & Technology

    1992-01-01

    and modification are anticipated. Most of the code for these programs was developed using Turbo C++ (Borland International, Scotts Valley, CA); the...Research through the Ocean Acoustics Program (code 11250) Contract N00014-88-K-0273 and Grant N00014-J-1445 with supplemental support from NOARL (code 211...bioacoustic laboratory to maintain and utilize an archive of digitized biological sounds. These tools are written in standard C code, and are designed to

  5. Concepts and tools for the software life cycle

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach for the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  6. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
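
    SLIM and COSTAR are proprietary, so as a stand-in the sketch below uses the well-known basic COCOMO organic-mode equations to show the shape of such a top-down parametric estimate (the 32 KLOC input is an arbitrary example):

```python
# Illustrative only: basic COCOMO "organic mode" effort and schedule,
# a public parametric model used here as a stand-in for the commercial
# SLIM and COSTAR tools discussed in the paper.
def cocomo_organic(kloc):
    """Effort (person-months) and schedule (months) for size in KLOC."""
    effort = 2.4 * kloc ** 1.05      # person-months
    schedule = 2.5 * effort ** 0.38  # elapsed months
    return effort, schedule

effort, schedule = cocomo_organic(32)   # a hypothetical 32 KLOC product
print(f"{effort:.1f} person-months over {schedule:.1f} months")
```

    As the abstract notes, such model outputs are best reconciled against bottom-up expert estimates rather than used alone.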

  7. Programming software for usability evaluation

    SciTech Connect

    Edwards, T.L.; Allen, H.W.

    1997-01-01

    This report provides an overview of the work completed for a portion of the User Interface Testbed for Technology Packaging (UseIT) project. The authors present software methods for programming systems to record and view interactions with a graphical user interface. A brief description of the human factors design process is presented. The software methods exploit features available in the X Window System and in the Windows 95 and Windows NT operating systems.

  8. Case Studies of Software Development Tools for Parallel Architectures

    DTIC Science & Technology

    1993-06-01

    ...surveyed (descriptions of these, and all other tools mentioned in this report, are provided in appendix B): GARDEN, FIELD, PIE, Prometheus, Faust, CODE...[fragment of a table mapping these tools to parallel software engineering problem areas (specification, design, algorithm/data partitioning, load balancing, debugging, reuse, testing, evaluation) omitted as unrecoverable]

  9. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  10. Forecasting trends in NASA flight software development tools

    NASA Technical Reports Server (NTRS)

    Garman, J. R.

    1983-01-01

    The experience gained in the design and development of Shuttle flight and ground support embedded software systems along with projections of increasing role and size of software in the proposed Space Station and other future NASA projects provides the basis for forecasting substantial changes in the tools and methodologies by which embedded software systems are developed and acquired. Similar changes in software architectures and operator interfaces will lead to substantial changes in the approach and techniques involved in software test and system integration. Increasing commonality among different flight systems and between flight and supporting ground systems is projected, along with a more distributed approach to software acquisition in highly complex projects such as Space Station.

  11. Software for Use with Optoelectronic Measuring Tool

    NASA Technical Reports Server (NTRS)

    Ballard, Kim C.

    2004-01-01

    A computer program has been written to facilitate and accelerate the process of measurement by use of the apparatus described in "Optoelectronic Tool Adds Scale Marks to Photographic Images" (KSC-12201). The tool contains four laser diodes that generate parallel beams of light spaced apart at a known distance. The beams of light are used to project bright spots that serve as scale marks that become incorporated into photographic images (including film and electronic images). The sizes of objects depicted in the images can readily be measured by reference to the scale marks. The computer program is applicable to a scene that contains the laser spots and that has been imaged in a square-pixel format that can be imported into a graphical user interface (GUI) generated by the program. It is assumed that the laser spots and the distance(s) to be measured all lie in the same plane and that the plane is perpendicular to the line of sight of the camera used to record the image.
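
    The measurement principle is simple to sketch: the known real-world spacing of the laser spots yields a millimetres-per-pixel scale factor that applies to anything in the same plane. All coordinates and the 50 mm beam spacing below are invented for illustration:

```python
# Minimal sketch of the laser-spot scale-mark idea: two spots with a
# known physical separation give a mm-per-pixel scale for measuring
# objects in the same plane. All numbers are hypothetical examples.
import math

def scale_mm_per_px(spot_a, spot_b, known_mm):
    """Scale factor from two laser-spot pixel coordinates."""
    px = math.dist(spot_a, spot_b)
    return known_mm / px

# Spots detected at these pixel coordinates, beams 50 mm apart.
scale = scale_mm_per_px((102, 340), (502, 340), known_mm=50.0)
# Measure an object spanning 260 pixels in the same plane.
print(round(260 * scale, 1))    # object length in mm
```

    The in-plane and perpendicular-view assumptions stated in the abstract are what let a single scalar scale factor stand in for a full camera calibration.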

  12. MOSS, an evaluation of software engineering techniques

    NASA Technical Reports Server (NTRS)

    Bounds, J. R.; Pruitt, J. L.

    1976-01-01

    An evaluation of the software engineering techniques used for the development of the Modular Operating System (MOSS) is described. MOSS is a general-purpose real-time operating system which was developed for the Concept Verification Test (CVT) program. Each of the software engineering techniques is described and evaluated based on the experience of the MOSS project. Recommendations for the use of these techniques on future software projects are also given.

  13. Software for systems biology: from tools to integrated platforms.

    PubMed

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2011-11-03

    Understanding complex biological systems requires extensive support from software tools. Such tools are needed at each step of a systems biology computational workflow, which typically consists of data handling, network inference, deep curation, dynamical simulation and model analysis. In addition, there are now efforts to develop integrated software platforms, so that tools that are used at different stages of the workflow and by different researchers can easily be used together. This Review describes the types of software tools that are required at different stages of systems biology research and the current options that are available for systems biology researchers. We also discuss the challenges and prospects for modelling the effects of genetic changes on physiology and the concept of an integrated platform.

  14. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    PubMed

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS on the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, it is clear that this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals to clinical effects and to guide therapeutic choices. We have developed a software at Poitiers University hospital allowing real-time objective mapping of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself, on a touch screen interface. The purpose of this article is to describe this intraoperative mapping software, in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software is dedicated to patients with failed back surgery syndrome, candidates for SCS lead implantation, to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by percutaneous or multicolumn surgical SCS lead implanted under awake anaesthesia allows intraoperative lead programming and possibly lead positioning to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of NML software should not be limited to optimize one specific device implantation in a patient but also allow to compare instantaneously various stimulation strategies, by characterizing new technical parameters as "coverage efficacy" and "device specificity" on selected subgroups of patients. 
Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming.

  15. Risk Assessment Methodology for Software Supportability (RAMSS): guidelines for Adapting Software Supportability Evaluations

    DTIC Science & Technology

    1986-04-14

    implemented various software OT&E methodologies. Two of these methods, Software Product Maintainability evaluation and Software Support Resources evaluation...methods have matured and have become the Air Force standard for evaluating software supportability. Each of these developed methods evaluates...assessment method which provides software testers with areas which require testing emphasis, and decision makers with an assessment of the software sup...

  16. HANSIS software tool for the automated analysis of HOLZ lines.

    PubMed

    Holec, D; Sridhara Rao, D V; Humphreys, C J

    2009-06-01

    A software tool, named as HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.
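
    The geometric core of such an analysis, finding where two HOLZ lines intersect and the angle between them, can be sketched as follows (coordinates invented; the actual HANSIS analysis is considerably more sophisticated):

```python
# Hedged sketch of the HOLZ-line geometry: each line is given by two
# points; compute the intersection and the acute angle between lines.
# Coordinates are invented examples, not CBED data.
import math

def intersect(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (assumed non-parallel)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def angle_deg(p1, p2, p3, p4):
    """Acute angle (degrees) between the two lines."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(p4[1] - p3[1], p4[0] - p3[0])
    d = abs(a1 - a2) % math.pi
    return math.degrees(min(d, math.pi - d))

# Two lines crossing at the origin, 45 degrees apart.
print(intersect((-1, -1), (1, 1), (-1, 0), (1, 0)))
print(angle_deg((-1, -1), (1, 1), (-1, 0), (1, 0)))
```

    Repeating these two computations over every pair of detected lines yields the intersection angle/distance data the abstract describes.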

  17. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    Quantitative Methods for Software Selection and Evaluation, Michael S. Bandor, September 2006, Acquisition Support Program...Evaluation Methods...Abstract: When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily

  18. A Checklist to Evaluate Mapping Software.

    ERIC Educational Resources Information Center

    Werner, Robert; Young, James

    1991-01-01

    Presents a checklist for evaluating commercially available mapping software. Analyzes software features by general categories including range of map types, availability and flexibility of data files, and program evaluation. Discusses ease of operation, the manual, tutorial, screens and help, error handling, design flexibility, hard copy output,…

  19. HALOE test and evaluation software

    NASA Technical Reports Server (NTRS)

    Edmonds, W.; Natarajan, S.

    1987-01-01

    Computer programming, system development and analysis efforts during this contract were carried out in support of the Halogen Occultation Experiment (HALOE) at NASA/Langley. Support in the major areas of data acquisition and monitoring, data reduction and system development are described along with a brief explanation of the HALOE project. Documented listings of major software are located in the appendix.

  20. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  1. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  2. Exoskeletons, Robots and System Software: Tools for the Warfighter

    DTIC Science & Technology

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am-12:00 pm. "The views...Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will...global hub for educating, informing, and connecting Information Age leaders." What is an exoskeleton? An exoskeleton is a wearable robot suit that

  3. Software Tools in Endoscopy - Nice to Have or Essential?

    PubMed Central

    Möschler, Oliver

    2016-01-01

    Background Documentation of findings and of the treatment implications resulting from them is one of the central tasks involved in medical work. The introduction of software tools for managing and providing technical support for this task is a logical development. Methods A literature search was conducted in September 2015 using PubMed and the search terms ‘gastrointestinal endoscopy AND electronic documentation’ and ‘software tools AND gastrointestinal endoscopy AND documentation’. Results The requirements in relation to documentation, patient information and sedation, dealing with histological findings, materials logistics, recording video documents, and hygiene documentation are discussed. Conclusion Software tools are essential for managing basic documentation requirements. However, for many aspects of the documentation required in a modern endoscopy department, there are various - and sometimes substantial - gaps in the programs currently available. More intensive discussions need to take place regarding existing gaps and requirements, both with the suppliers concerned and among colleagues and specialist societies. PMID:27588294

  4. DEVICE CONTROL TOOL FOR CEBAF BEAM DIAGNOSTICS SOFTWARE

    SciTech Connect

    Pavel Chevtsov

    2008-02-11

    Continuously monitoring the beam quality in the CEBAF accelerator, a variety of beam diagnostics software created at Jefferson Lab makes a significant contribution to very high availability of the machine for nuclear physics experiments. The interface between this software and beam instrumentation hardware components is provided by a device control tool, which is optimized for beam diagnostics tasks. As a part of the device/driver development framework at Jefferson Lab, this tool is very easy to support and extend to integrate new beam instrumentation components. All device control functions are based on the configuration (ASCII text) files that completely define the used hardware interface standards (CAMAC, VME, RS-232, GPIB, etc.) and communication protocols. The paper presents the main elements of the device control tool for beam diagnostics software at Jefferson Lab.
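    The abstract describes a device control layer driven entirely by ASCII configuration files that name each device's hardware interface standard and address. A minimal sketch of that idea, assuming a hypothetical one-line-per-device format (the actual Jefferson Lab file layout and field names are not given in the abstract):

    ```python
    from dataclasses import dataclass

    @dataclass
    class DeviceEntry:
        name: str     # device identifier, e.g. a beam position monitor
        bus: str      # interface standard: CAMAC, VME, RS-232, GPIB, ...
        address: str  # bus-specific address string

    def parse_device_config(text: str) -> list[DeviceEntry]:
        """Parse lines of the form '<name> <bus> <address>', ignoring '#' comments."""
        entries = []
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()
            if not line:
                continue
            name, bus, address = line.split(None, 2)
            entries.append(DeviceEntry(name, bus, address))
        return entries

    # Illustrative configuration text (names and addresses are invented).
    config = """
    # beam position monitors
    bpm01  VME     0x3000
    bpm02  CAMAC   crate2/slot5
    scope1 GPIB    gpib0:12
    """
    for e in parse_device_config(config):
        print(e.name, e.bus, e.address)
    ```

    Keeping the hardware mapping in plain text files, as the paper describes, is what makes the tool easy to extend: adding a new instrument is a configuration edit, not a code change.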

  5. Concepts and Tools for the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The tools, techniques, and aids needed to engineer, manage, and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. The needs of the software life cycle in terms of such supporting tools and methodologies are highlighted. The concept of a distributed network for engineering, management, and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineering approach for the construction of a uniform, coordinated tool set is proposed as a means to reduce development and maintenance costs, foster adaptability, enhance reliability, and promote standardization.

  6. Concepts and tools for the software life cycle

    NASA Astrophysics Data System (ADS)

    Tausworthe, Robert C.

    1985-10-01

    The life cycle process for large software-intensive systems is an extremely intricate and complex process involving many people performing amid a very large base of evolving computer programs, documentation and data. To be successful, the process must be well conceived, planned and conducted; however, the nature of scientific and other high-technology projects involving large-scale software is such that conceptualization, planning and implementation to the degree of detail required is so labor-intensive and unmotivating as to be counter-productive and seldom cost-effective. The tools, techniques and aids needed to engineer, manage and administer a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. This paper focuses on the needs of the software life cycle in terms of supporting tools and methodologies. The concept of a distributed network for engineering, management and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineered approach toward the construction of uniform, coordinated tools is proposed as a means to reduce development and maintenance costs, foster creativity, enhance reliability, promote standardization and sustain human motivation.

  7. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  8. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  9. Evaluation of science instructional software

    NASA Astrophysics Data System (ADS)

    Shrivastava, Abha

    The primary purpose of this study was to determine the extent to which the current science education software measures up to the national science literacy standards as specified by the AAAS. The usability of software was also assessed. A sample of ten widely used middle and high school science instruction CD-ROMs was chosen from a sampling frame comprising 17 bestsellers and 30 highly recommended software packages. The study used a five-step analysis and rating procedure derived closely from the one developed by the AAAS's Project 2061 for analyzing the content and pedagogy of widely used science textbooks. Each was also rated on a checklist of 20 usability criteria that were compiled from prior research on design, usability, and interactiveness of instructional software. The results indicate that an average of about 25% of a CD-ROM's content (which includes text, graphics, exercises, or a combination of these) addressed the AAAS benchmarks of science literacy, but anywhere from 12–45% of this content was found to address benchmarks for lower grades than those the CD-ROM claimed to target. Much of the remaining content addressing the benchmarks did so only partially. Thus, overall, there was poor alignment with the benchmarks. The sample failed to meet pedagogical criteria associated with attending to students' pre-existing knowledge, ideas, and skills; assessing students' progress vis-a-vis the goals of the lesson on the CD-ROM; and promoting students' thinking and reflection about what they have learned. The CD-ROMs scored satisfactory to excellent on criteria associated with providing relevant and varied experiences with scientific phenomena and effective and meaningful representation of scientific terms and ideas. The sample scored satisfactorily on all usability criteria except on some design and interface matters.
The popular CD-ROMs' lack of alignment with the AAAS science standards may be due to laxness on the part of developers, a

  10. Microcomputer Software Engineering, Documentation and Evaluation

    DTIC Science & Technology

    1981-03-31

    microcomputer program called "EVAL." 4.1 The Evaluation Methodology At the core of EVAL lies an evaluation methodology known as multi-attribute utility theory...Agent (RITA): Reference Manual. Santa Monica, California: The Rand Corporation, December 1976. Edwards, W. "How to Use Multiattribute Utility...structured programming, unconventional documentation, and multi-attribute utility-based software evaluation. The general methods employed include software

  11. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  12. Knowledge engineering software: A demonstration of a high end tool

    SciTech Connect

    Salzman, G.C.; Krall, R.B.; Marinuzzi, J.G.

    1987-01-01

    Many investigators wanting to apply knowledge-based systems (KBS) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with the high end KBS tools such as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. To illustrate the capability of one of these high end tools we have converted a table showing the classification of benign soft tissue tumors into a KEE knowledge base. We have used the tools available in Kee to identify the tumor type for a hypothetical patient. 10 figs.

  13. PAnalyzer: a software tool for protein inference in shotgun proteomics.

    PubMed

    Prieto, Gorka; Aloria, Kerman; Osinalde, Nerea; Fullaondo, Asier; Arizmendi, Jesus M; Matthiesen, Rune

    2012-11-05

    Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore the inspection, comparison and report of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates integration. 
PAnalyzer is an easy to use multiplatform and
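    The core problem the abstract describes is grouping proteins by peptide evidence when peptides are shared. The sketch below is an illustration of that idea, not PAnalyzer's actual algorithm; the three category names used here (a protein with a unique peptide is "conclusive", one whose peptide set is identical to another's is "indistinguishable", and all-shared-but-distinct sets are "ambiguous") are assumptions for the example:

    ```python
    def classify_proteins(prot_to_peps: dict[str, frozenset[str]]) -> dict[str, str]:
        """Assign each protein an evidence category based on peptide sharing."""
        result = {}
        for prot, peps in prot_to_peps.items():
            other_sets = [p for q, p in prot_to_peps.items() if q != prot]
            shared = set().union(*other_sets) if other_sets else set()
            if peps - shared:
                result[prot] = "conclusive"          # has at least one unique peptide
            elif any(peps == p for p in other_sets):
                result[prot] = "indistinguishable"   # identical peptide set to another protein
            else:
                result[prot] = "ambiguous"           # all peptides shared, but sets differ
        return result

    groups = classify_proteins({
        "P1": frozenset({"a", "b"}),   # peptide "b" is unique to P1
        "P2": frozenset({"a", "c"}),
        "P3": frozenset({"a", "c"}),   # same peptide set as P2
    })
    print(groups)  # → {'P1': 'conclusive', 'P2': 'indistinguishable', 'P3': 'indistinguishable'}
    ```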

  14. Selecting and Evaluating Software for Vocational Education.

    ERIC Educational Resources Information Center

    Rodenstein, Judith

    This handbook is intended to guide the vocational educator through the task of selecting and evaluating software for the vocational curriculum. Section 1 focuses on computer-based education. Chapter 1 defines computer-based education and the hardware and software required when the computer is used as an educational delivery system. Chapter 2…

  15. Computer software management, evaluation, and dissemination

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.

  16. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716

  17. Design and implementation of the mobility assessment tool: software description.

    PubMed

    Barnard, Ryan T; Marsh, Anthony P; Rejeski, Walter Jack; Pecorella, Anthony; Ip, Edward H

    2013-07-23

    In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications--one built in Java, the other in Objective-C for the Apple iPad--were then built that could present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine.
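    The design described above, an instrument defined fully in an XML document and interpreted by separate runner applications, can be sketched as follows. The element and attribute names here are invented for illustration and are not the authors' published schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical instrument definition: one video-animated item with
    # three response options, in the spirit of the mobility assessment tool.
    INSTRUMENT_XML = """
    <instrument name="mobility-demo">
      <item id="1" video="stand_from_chair.mp4">
        <prompt>Could you do this activity?</prompt>
        <response value="1">Yes, without difficulty</response>
        <response value="2">Yes, with some difficulty</response>
        <response value="3">No, I could not</response>
      </item>
    </instrument>
    """

    def load_items(xml_text: str) -> list[dict]:
        """Interpret the instrument document into presentable items."""
        root = ET.fromstring(xml_text)
        items = []
        for item in root.findall("item"):
            items.append({
                "id": item.get("id"),
                "video": item.get("video"),
                "prompt": item.findtext("prompt"),
                "responses": {r.get("value"): r.text for r in item.findall("response")},
            })
        return items

    for it in load_items(INSTRUMENT_XML):
        print(it["id"], it["prompt"], list(it["responses"]))
    ```

    Because the instrument lives in data rather than code, the same document can drive both a Java desktop runner and an iPad runner, which is exactly the portability benefit the authors report.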

  18. Automated software development tools in the MIS (Management Information Systems) environment

    SciTech Connect

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  19. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  20. Simple tools and software for precision weed mapping

    USDA-ARS?s Scientific Manuscript database

    Simple Tools and Software for Precision Weed Mapping L. Wiles If you have a color digital camera and a handheld GPS unit, you can map weed problems in your fields. German researchers are perfecting technology to map weed species and density with digital cameras for precision herbicide application. ...

  1. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  2. GenePRIMP: A software quality control tool

    SciTech Connect

    Amrita Pati

    2010-05-05

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  3. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2016-07-12

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  4. Chips: A Tool for Developing Software Interfaces Interactively.

    ERIC Educational Resources Information Center

    Cunningham, Robert E.; And Others

    This report provides a detailed description of Chips, an interactive tool for developing software employing graphical/computer interfaces on Xerox Lisp machines. It is noted that Chips, which is implemented as a collection of customizable classes, provides the programmer with a rich graphical interface for the creation of rich graphical…

  6. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    DTIC Science & Technology

    2003-06-01

    to steer me in the right directions, clarify my understanding of complex material, and offer suggestions of how to improve the dissertation were...tools developed at the Naval Postgraduate School. Evolved from CAPS and Distributed CAPS (DCAPS). Software Engineering Body of Knowledge (SWEBOK

  7. Role of Social Software Tools in Education: A Literature Review

    ERIC Educational Resources Information Center

    Minocha, Shailey

    2009-01-01

    Purpose: The purpose of this paper is to provide a review of literature on the role of Web 2.0 or social software tools in education. Design/methodology/approach: This paper is a critical and comprehensive review of a range of literature sources (until January 2009) addressing the various issues related to the educator's perspective of pedagogical…

  8. Proposing a Mathematical Software Tool in Physics Secondary Education

    ERIC Educational Resources Information Center

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…

  9. THE ATMOSPHERIC MODEL EVALUATION TOOL

    EPA Science Inventory

    This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...

  11. Software tools for visualizing Hi-C data.

    PubMed

    Yardımcı, Galip Gürkan; Noble, William Stafford

    2017-02-03

    High-throughput assays for measuring the three-dimensional (3D) configuration of DNA have provided unprecedented insights into the relationship between DNA 3D configuration and function. Data interpretation from assays such as ChIA-PET and Hi-C is challenging because the data is large and cannot be easily rendered using standard genome browsers. An effective Hi-C visualization tool must provide several visualization modes and be capable of viewing the data in conjunction with existing, complementary data. We review five software tools that do not require programming expertise. We summarize their complementary functionalities, and highlight which tool is best equipped for specific tasks.

  12. Computerized nursing staffing: a software evaluation.

    PubMed

    Pereira, Irene Mari; Gaidzinski, Raquel Rapone; Fugulin, Fernanda Maria Togeiro; Peres, Heloísa Helena Ciqueto; Lima, Antônio Fernandes Costa; Castilho, Valéria; Mira, Vera Lúcia; Massarollo, Maria Cristina Komatsu Braga

    2011-12-01

    The complexity involved in operationalizing the method for nursing staffing, given the countless variables related to identifying the workload, the effective working time of the staff, and the Technical Security Index (TSI), revealed the need to develop a software program named Computerized Nursing Staffing (DIPE, its Portuguese acronym). This exploratory, descriptive study was performed with the objective of evaluating the technical quality and functional performance of DIPE. Participants were eighteen evaluators: ten nurse faculty or nurse hospital unit managers, and eight health informatics experts. The software evaluation was performed according to standard NBR ISO/IEC 9126-1, considering the features of functionality, reliability, usability, efficiency, and maintainability. The evaluation reached positive results, with agreement among the evaluators on all the evaluated features. The suggestions reported are important for further improving and enhancing DIPE.
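    The staffing method the software operationalizes combines workload, effective working time, and the TSI. A hedged sketch of one common form of that calculation (the abstract does not give DIPE's exact equations, so the formula below is an assumption for illustration):

    ```python
    def staff_needed(daily_care_hours: float, effective_hours: float, tsi: float) -> float:
        """Estimate the nursing staff required per day.

        daily_care_hours: total patient-care hours demanded per day (workload)
        effective_hours:  effective working hours per nurse per day
        tsi:              Technical Security Index as a fraction (e.g. 0.15 for 15%),
                          inflating the staff count to cover planned and unplanned absences
        """
        return (daily_care_hours / effective_hours) * (1.0 + tsi)

    # e.g. 120 care-hours/day, 6 effective hours per nurse, 15% TSI
    print(round(staff_needed(120, 6, 0.15), 1))  # → 23.0
    ```

    Even this simplified form hints at why the authors automated the method: each input is itself derived from many measurements, and small changes in any of them propagate to the staffing estimate.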

  13. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
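    The architecture described above, annunciators mapped to vital-sign data and programmable with user-supplied alarm algorithms, can be sketched in miniature. This is an assumed design for illustration, not the authors' code; the vital-sign names and thresholds are invented:

    ```python
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Annunciator:
        name: str
        trigger: Callable[[float], bool]  # user-supplied alarm algorithm

        def process(self, value: float) -> Optional[str]:
            """Return the alarm name if the reading should annunciate."""
            return self.name if self.trigger(value) else None

    # Map vital-sign channels to annunciators (thresholds are illustrative).
    alarms = {
        "SpO2": Annunciator("low-oxygen", lambda v: v < 90),
        "HR":   Annunciator("tachycardia", lambda v: v > 120),
    }

    def annunciate(sample: dict[str, float]) -> list[str]:
        """Run one monitor sample through every mapped annunciator."""
        fired = []
        for vital, value in sample.items():
            if vital in alarms:
                hit = alarms[vital].process(value)
                if hit:
                    fired.append(hit)
        return fired

    print(annunciate({"SpO2": 86, "HR": 128}))  # → ['low-oxygen', 'tachycardia']
    ```

    Keeping each annunciator a small pluggable object mirrors the run-time flexibility the paper emphasizes: a researcher can swap in a novel alarm algorithm without touching the rest of the pipeline.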

  14. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and it requires engineering specialist to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  15. Evaluation of a Human Modeling Software Tool in the Prediction of Extra Vehicular Activity Tasks for an International Space Station Assembly Mission

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles; Loughead, Tomas E.

    1997-01-01

    The difficulty of accomplishing work in extravehicular activity (EVA) is well documented. It arises as a result of motion constraints imposed by a pressurized spacesuit in a near-vacuum and of the frictionless environment induced in microgravity. The appropriate placement of foot restraints is crucial to ensuring that astronauts can remove and drive bolts, mate and demate connectors, and actuate levers. The location on structural members of the foot restraint sockets, to which the portable foot restraint is attached, must provide for an orientation of the restraint that affords the astronaut adequate visual and reach envelopes. Previously, the initial location of these sockets was dependent upon the experienced designer's ability to estimate placement. The design was tested in a simulated zero-gravity environment; spacesuited astronauts performed the tasks with mockups while submerged in water. Crew evaluation of the tasks based on these designs often indicated the bolt or other structure to which force needed to be applied was not within an acceptable work envelope, resulting in redesign. The development of improved methods for location of crew aids prior to testing would result in savings to the design effort for EVA hardware. Such an effort to streamline EVA design is especially relevant to International Space Station construction and maintenance. Assembly operations alone are expected to require in excess of four hundred hours of EVA. Thus, techniques which conserve design resources for assembly missions can have significant impact. We describe an effort to implement a human modelling application in the design effort for an International Space Station Assembly Mission. On Assembly Flight 6A, the Canadian-built Space Station Remote Manipulator System will be delivered to the U.S. Laboratory. It will be released from its launch restraints by astronauts in EVA. The design of the placement of foot restraint sockets was carried out using the human model Jack, and

  16. Software design for professional risk evaluation

    NASA Astrophysics Data System (ADS)

    Ionescu, V.; Calea, G.; Amza, G.; Iacobescu, G.; Nitoi, D.; Dimitrescu, A.

    2016-08-01

    Professional risk evaluation is a complex activity involving every economic operator, with important repercussions for health and safety at work. The article presents an innovative study method for professional risk analysis in which cumulative working posts are evaluated. The work presents new software that helps bring together all the working positions in a complex organizational system and analyze them in order to evaluate the possible risks. Using this software, multiple analyses can be performed: risk estimation, risk evaluation, estimation of residual risks, and finally the search for risk-reduction measures.

  17. An Analysis of Adenovirus Genomes Using Whole Genome Software Tools

    PubMed Central

    Mahadevan, Padmanabhan

    2016-01-01

    The evolution of sequencing technology has led to an enormous increase in the number of genomes that have been sequenced. This is especially true in the field of virus genomics. In order to extract meaningful biological information from these genomes, whole genome data mining software tools must be utilized. Hundreds of tools have been developed to analyze biological sequence data. However, only some of these tools are user-friendly to biologists. Several of these tools that have been successfully used to analyze adenovirus genomes are described here. These include Artemis, EMBOSS, pDRAW, zPicture, CoreGenes, GeneOrder, and PipMaker. These tools provide functionalities such as visualization, restriction enzyme analysis, alignment, and proteome comparisons that are extremely useful in the bioinformatics analysis of adenovirus genomes. PMID:28293072

  18. The MEDA Project: Developing Evaluation Competence in the Training Software Domain.

    ERIC Educational Resources Information Center

    Machell, Joan; Saunders, Murray

    1992-01-01

    The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…

  19. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  20. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
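
    The two-stage pipeline the abstract describes, rules mapping specifications to generic components and a nonlinear sizing function producing the final prediction, can be sketched as follows. All rules, component names, and coefficients below are invented for illustration; they are not the system's actual knowledge base.

```python
# Hypothetical sketch of the two-stage estimator: rules map specification
# phrases to generic components with complexity weights, then a nonlinear
# sizing function turns total complexity into a line-of-code estimate.
# All rules and coefficients below are invented for illustration.

RULES = {
    "user interface": ("io_component", 3.0),
    "database": ("data_component", 5.0),
    "report": ("io_component", 2.0),
    "control loop": ("logic_component", 4.0),
}

def classify(spec_phrases):
    """Map specification phrases to (generic component, complexity weight)."""
    return [RULES[p] for p in spec_phrases if p in RULES]

def sizing_function(components, a=120.0, b=1.1):
    """Nonlinear sizing: size grows slightly faster than linearly."""
    total = sum(weight for _, weight in components)
    return a * total ** b

comps = classify(["user interface", "database", "report"])
estimate = sizing_function(comps)   # roughly 1500 source lines
```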

  1. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
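
    One of the published models COSTMODL incorporates, Basic COCOMO, illustrates the kind of recalibratable estimation equation the abstract describes. The coefficients below are Boehm's standard organic-mode values; recalibration amounts to refitting them to an organization's own project history.

```python
# Basic COCOMO (organic mode): effort and schedule as power laws of
# size in thousands of lines of code (KLOC). Coefficients are Boehm's
# published organic-mode values; a recalibrated tool would replace them.

def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Return (effort in person-months, schedule in calendar months)."""
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32)   # a 32 KLOC project
```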

  2. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
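
    The execution tracing DFPN performs over the translated diagram can be illustrated with a minimal Petri-net token game; the places and transitions below are invented examples, not DFPN's actual internal representation.

```python
# Minimal Petri-net sketch: a marking assigns tokens to places; a
# transition is enabled when all its input places hold a token, and
# firing it moves tokens from inputs to outputs. Tracing fired
# transitions yields an execution path through the net.

marking = {"input_ready": 1, "validated": 0, "stored": 0}

# transition name: (input places consumed, output places produced)
transitions = {
    "validate": (["input_ready"], ["validated"]),
    "store":    (["validated"], ["stored"]),
}

def enabled(name):
    ins, _ = transitions[name]
    return all(marking[p] >= 1 for p in ins)

def fire(name):
    ins, outs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p in ins:
        marking[p] -= 1
    for p in outs:
        marking[p] += 1

fire("validate")
fire("store")     # execution path: input_ready -> validated -> stored
```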

  3. Development of a software tool for an internal dosimetry using MIRD method

    NASA Astrophysics Data System (ADS)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed. Many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, a software tool that performs internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated residence times and absorbed doses for 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the residence times and absorbed doses obtained from the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software tool for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
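
    The calculation chain the abstract describes can be sketched in a few lines: integrate a fractional time-activity curve to obtain the residence time, then multiply the cumulated activity by an S value to obtain the absorbed dose. The sample points, injected activity, and S value below are invented for illustration.

```python
# Sketch of the MIRD chain: time-activity data -> residence time ->
# absorbed dose. Values below are hypothetical, not from any study.

def residence_time(times_h, fraction_of_injected):
    """Trapezoidal integral of the fractional time-activity curve (hours)."""
    tau = 0.0
    for i in range(len(times_h) - 1):
        dt = times_h[i + 1] - times_h[i]
        tau += 0.5 * (fraction_of_injected[i] + fraction_of_injected[i + 1]) * dt
    return tau

def absorbed_dose(tau_h, a0_mbq, s_value):
    """MIRD: dose = cumulated activity (residence time x injected activity) x S."""
    return tau_h * a0_mbq * s_value

tau = residence_time([0, 4, 24, 48], [0.8, 0.6, 0.2, 0.05])   # 13.8 h
dose = absorbed_dose(tau, a0_mbq=150.0, s_value=0.01)         # hypothetical S
```

    A production tool would also extrapolate the curve beyond the last sample using the physical decay constant, which this sketch omits.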

  4. EPA's evaluation of utility emissions data submitted under Title IV: Application of software tools in an operational data quality assurance program

    SciTech Connect

    Hillock, C.S.; Wockenfuss, M.E.

    1995-12-31

    Title IV (Acid Deposition Control) of the Clean Air Act Amendments of 1990 requires annual reductions of 10 million tons of sulfur dioxide and substantial reductions of nitrogen oxides from electric utilities. These reductions will occur in two phases with Phase 1 beginning in 1995. To ensure the reduction goals are met, affected utilities must monitor their emissions, perform quality assurance and quality control tests, and report the data to EPA as required by 40 CFR Part 75. EPA's Emissions Tracking System (ETS) was developed to analyze all submitted data reports and to provide the annual emissions data needed to determine whether utilities comply with their allowable SO2 emissions. EPA received the first quarterly reports from Phase 1 utilities at the end of January, 1994. A substantial number of these initial reports exhibited major problems and EPA required many utilities to improve and resubmit their reports. Data reports for the following three quarters of 1994 showed continual improvement as utilities gained experience in operating and improving their monitoring hardware and software and as EPA clarified published guidance. Helpful utility comments enabled EPA to correct and refine ETS. Phase 2 will affect approximately 700 additional plants. These plants will begin reporting their emissions data to EPA at the beginning of May, 1995. EPA is improving data handling and processing procedures and expanding the ETS software system capabilities to receive and process the substantial increase in submitted data during 1995.

  5. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. An Approach to Evaluate Software Effectiveness.

    DTIC Science & Technology

    1996-12-01

    outlined some basic objectives which were adapted to the objectives listed in Section 1.3 and repeated below: 1) Develop a working definition of...Strategies for Hardware Reconfiguration. Schwab describes reconfiguration strategies for real-time, very large scale integrated circuit processing...controlled conditions, observing the behavior of the software, and making an evaluation about the software. Individual tests can be made up of several test

  7. Ensuring system security through formal software evaluation

    SciTech Connect

    Howell, J A; Fuyat, C; Elvy, M

    1992-01-01

    With the increasing use of computer systems and networks to process safeguards information in nuclear facilities, the issue of system and data integrity is receiving worldwide attention. Among the many considerations are validation that the software performs as intended and that the information is adequately protected. Such validations are often requested of the Safeguards Systems Group of the Los Alamos National Laboratory. This paper describes our methodology for performing these software evaluations.

  8. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  9. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  10. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  12. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.

    1990-01-01

    A prototype is described that can serve as a scientific-modeling software tool to facilitate the development of useful scientific models. The prototype is developed for applications to planetary modeling, and specific examples are given that relate to the atmosphere of Titan. The scientific modeling tool employs a high-level domain-specific modeling language, several data-display facilities, and a library of experimental datasets and scientific equations. The planetary modeling prototype links uncomputed physical variables to computed variables with computational transformations based on a backchaining procedure. The system - implemented in LISP with an object-oriented knowledge-representation tool - is run on a workstation that provides interface with several models. The prototype is expected to form the basis for a sophisticated modeling tool that can permit active experimentation.
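
    The backchaining procedure described above, computational transformations that link uncomputed physical variables to computed ones on demand, can be sketched as follows. The variables and formulas below are invented illustrations (a toy isothermal scale-height calculation), not the prototype's actual equation library.

```python
import math

# Sketch of backchaining: to obtain an uncomputed variable, recursively
# compute whatever its transformation requires first, caching results.

R = 8.314                                  # J / (mol K)
known = {"temperature": 94.0,              # K
         "molar_mass": 0.028,              # kg/mol
         "gravity": 1.35}                  # m/s^2

rules = {
    # derived variable: (required inputs, transformation)
    "scale_height": (("temperature", "molar_mass", "gravity"),
                     lambda T, M, g: R * T / (M * g)),
    "pressure_ratio_1km": (("scale_height",),
                           lambda H: math.exp(-1000.0 / H)),
}

def solve(var):
    """Backchain: return var's value, computing prerequisites on demand."""
    if var not in known:
        inputs, fn = rules[var]
        known[var] = fn(*(solve(v) for v in inputs))
    return known[var]

ratio = solve("pressure_ratio_1km")   # triggers scale_height first
```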

  13. Westinghouse Waste Simulation and Optimization Software Tool - 13493

    SciTech Connect

    Mennicken, Kim; Aign, Joerg

    2013-07-01

    Radioactive waste is produced during NPP operation and NPP decontamination and decommissioning (D&D). Different kinds of waste with different volumes and properties have to be treated. Finding a technically and commercially optimized waste treatment concept is a difficult and time-consuming process. The Westinghouse waste simulation and optimization software tool is an approach to study the total life cycle cost of any waste management facility. The tool enables the user of the simulation and optimization software to plan processes and storage buildings and to identify bottlenecks in the overall waste management design before starting detailed planning activities. Furthermore, application of the software enables the user to optimize the number of treatment systems, to determine the minimum design capacity for onsite storage facilities and to identify the most cost-effective treatment paths by maintaining optimal waste treatment technologies. In combination with proven waste treatment equipment and integrated waste management solutions, the waste simulation and optimization software provides reliable qualitative results that lead to effective planning and minimization of the total project planning risk of any waste management activity. (authors)

  14. Thermography based prescreening software tool for veterinary clinics

    NASA Astrophysics Data System (ADS)

    Dahal, Rohini; Umbaugh, Scott E.; Mishra, Deependra; Lama, Norsang; Alvandipour, Mehrdad; Umbaugh, David; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Under development is a clinical software tool that can be used in veterinary clinics as a prescreening tool for these pathologies: anterior cruciate ligament (ACL) disease, bone cancer and feline hyperthyroidism. Currently, veterinary clinical practice uses several imaging techniques including radiology, computed tomography (CT), and magnetic resonance imaging (MRI). However, the harmful radiation involved during imaging, expensive equipment setup, excessive time consumption and the need for a cooperative patient during imaging are major drawbacks of these techniques. In veterinary procedures, it is very difficult for animals to remain still for the time periods necessary for standard imaging without resorting to sedation, which creates another set of complexities. Therefore, clinical application software integrated with a thermal imaging system and algorithms with high sensitivity and specificity for these pathologies can address the major drawbacks of the existing imaging techniques. A graphical user interface (GUI) has been created to allow ease of use for the clinical technician. The technician inputs an image, enters patient information, and selects the camera view associated with the image and the pathology to be diagnosed. The software classifies the image using an optimized classification algorithm that has been developed through thousands of experiments. Optimal image features are extracted, and the feature vector is then used in conjunction with the stored image database for classification. Classification success rates as high as 88% for bone cancer, 75% for ACL and 90% for feline hyperthyroidism have been achieved. The software is currently undergoing preliminary clinical testing.

  15. Software Certification for Temporal Properties With Affordable Tool Qualification

    NASA Technical Reports Server (NTRS)

    Xia, Songtao; DiVito, Benedetto L.

    2005-01-01

    It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.

  16. Software Tools for Measuring and Calculating Electromagnetic Shielding Effectiveness

    DTIC Science & Technology

    2005-09-01

    Report ARL-TR-3645: Software Tools for Measuring and Calculating Electromagnetic Shielding Effectiveness, by Neal Tesny, Army Research Laboratory, Adelphi, MD 20783-1197, September 2005.

  17. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-06-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.

  18. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.

    PubMed

    Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A

    2016-09-01

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples.
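
    The center-of-mass column-locating step the abstract mentions reduces to an intensity-weighted centroid over a small image patch around each candidate column. A minimal sketch, with an invented 3x3 intensity patch:

```python
# Intensity-weighted (row, col) centroid of a 2D patch: the first step
# toward sub-pixel atomic column positions (2D Gaussian fitting refines
# this further). The patch values are invented for illustration.

def center_of_mass(patch):
    total = sum(v for row in patch for v in row)
    r = sum(i * v for i, row in enumerate(patch) for v in row)
    c = sum(j * v for row in patch for j, v in enumerate(row))
    return r / total, c / total

patch = [
    [1, 2, 1],
    [2, 8, 2],
    [1, 2, 1],
]
row, col = center_of_mass(patch)   # symmetric peak -> centered at (1.0, 1.0)
```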

  19. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
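
    The n-factor combinatorial idea above can be sketched for n = 2: for every pair of parameters, exercise all their value combinations while the remaining parameters stay at nominal values. The parameter names and values below are invented; this is one simple 2-factor scheme, not the tool's actual generator.

```python
from itertools import combinations, product

# 2-factor combinatorial case generation: covers all pairwise parameter
# interactions with far fewer cases than the full cross product.

params = {
    "mass_kg":  [90, 100, 110],
    "thrust_n": [4.0, 5.0, 6.0],
    "gain":     [0.8, 1.0, 1.2],
    "delay_s":  [0, 1, 2],
}
nominal = {k: v[0] for k, v in params.items()}

def two_factor_cases(params, nominal):
    """Vary each parameter pair jointly; hold the rest at nominal."""
    cases = []
    for p, q in combinations(params, 2):
        for vp, vq in product(params[p], params[q]):
            case = dict(nominal)
            case[p], case[q] = vp, vq
            cases.append(case)
    return cases

cases = two_factor_cases(params, nominal)
# 6 parameter pairs x 9 combinations = 54 cases, versus 81 for the full
# cross product; the gap widens rapidly as parameters are added.
```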

  20. Evaluation as a Learning Tool

    ERIC Educational Resources Information Center

    Feinstein, Osvaldo Nestor

    2012-01-01

    Evaluation of programs or projects is often perceived as a threat. This is to a great extent related to the anticipated use of evaluation for accountability, which is often prioritized at the expense of using evaluation as a learning tool. Frequently it is argued that there is a trade-off between these two evaluation functions. An alternative…

  2. Creating an automated tool for measuring software cohesion

    SciTech Connect

    Tutton, J.M.; Zucconi, L.

    1994-05-06

    Program modules with high complexity tend to be more error prone and more difficult to understand. These factors increase maintenance and enhancement costs. Hence, a tool that can help programmers determine a key factor in module complexity should be very useful. Our goal is to create a software tool that will automatically give a quantitative measure of the cohesiveness of a given module, and hence give us an estimate of the "maintainability" of that module. The tool will use a metric developed by Professors Linda M. Ott and James M. Bieman. The Ott/Bieman metric gives quantitative measures that indicate the degree of functional cohesion using abstract data slices.
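
    A simplified sketch in the spirit of slice-based cohesion: each output variable of a module has a data slice (the set of statements that affect it), and the fraction of statements common to every slice indicates how strongly the outputs belong together. The slices below are invented; the actual Ott/Bieman metrics are defined more carefully over data tokens.

```python
# Toy slice-overlap ratio: 1.0 means every statement contributes to
# every output (fully cohesive); low values suggest the module mixes
# unrelated computations. Statement identifiers are hypothetical.

def slice_cohesion(slices):
    """Fraction of statements shared by all slices."""
    statements = set().union(*slices)
    glue = set.intersection(*slices)
    return len(glue) / len(statements)

# statements contributing to each of two module outputs
slice_mean = {1, 2, 3, 4, 6}
slice_stddev = {1, 2, 3, 5, 6, 7}

cohesion = slice_cohesion([slice_mean, slice_stddev])   # 4 shared of 7
```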

  3. Classroom Live: a software-assisted gamification tool

    NASA Astrophysics Data System (ADS)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  4. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

  5. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
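Of the five algorithms listed, the Basic COCOMO model has a well-known published closed form. A minimal sketch in Python (the function name and the use of Boehm's published coefficient table are illustrative; COSTMODL's own implementation and customizations are not reproduced here):

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO estimate: effort (person-months) and schedule (months).

    Coefficients are Boehm's published Basic COCOMO values for the three
    project modes; kloc is estimated size in thousands of lines of code.
    """
    coeff = {
        "organic":      (2.4, 1.05, 2.5, 0.38),
        "semidetached": (3.0, 1.12, 2.5, 0.35),
        "embedded":     (3.6, 1.20, 2.5, 0.32),
    }
    a, b, c, d = coeff[mode]
    effort = a * kloc ** b       # person-months
    schedule = c * effort ** d   # calendar months
    return effort, schedule
```

For a 32 KLOC organic-mode project this gives roughly 91 person-months spread over about 14 calendar months.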

  6. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data are needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to detect NEOs and which observation strategies work best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components, ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. 
Users can define sensor systems using ground- or space-based locations as well as
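The "fictitious (random)" population option described above can be illustrated in a few lines. The element ranges and the perihelion cut q = a(1 - e) < 1.3 au (the usual NEO definition) are assumptions for the sketch, not the tool's actual sampling scheme:

```python
import random

def fictitious_neo_population(n, seed=1):
    """Draw n fictitious NEO orbits as dicts of semi-major axis a [au],
    eccentricity e, and inclination i [deg], rejecting any orbit whose
    perihelion distance q = a(1 - e) is not below 1.3 au."""
    rng = random.Random(seed)
    population = []
    while len(population) < n:
        a = rng.uniform(0.5, 4.0)
        e = rng.uniform(0.0, 0.9)
        if a * (1 - e) < 1.3:  # NEO perihelion criterion
            population.append({"a": a, "e": e, "i": rng.uniform(0.0, 40.0)})
    return population
```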

  7. Empirical Software Evaluation: A Practical Alternative.

    ERIC Educational Resources Information Center

    Hedbring, Charles

    1987-01-01

    The article presents a software evaluation checklist developed by a teaching-research laboratory for severely handicapped students in New York City. In an introductory section, the use of laptop microcomputers in helping handicapped learners acquire, maintain, and generalize functional skills is described as the fifth ingredient of an integrated…

  8. Generating and Evaluating Software Product Ideas.

    ERIC Educational Resources Information Center

    Coyne, John P.

    1989-01-01

    Ten ways to evaluate new software product ideas are presented, such as talking with computer user groups and advertising the product before development to determine consumer interest. Ten methods for generating new product ideas are also offered, including reading material on the fringe of one's work and soliciting opinions of potential clients.…

  9. The virtual clinical evaluation tool.

    PubMed

    Sander, Rebecca; Trible, Karen A

    2008-01-01

    Two years ago, faculty and students at this rural university collaborated to implement a virtual clinical evaluation tool. In recognition of the frustrations involved in coordinating instructor and student input to a hard-copy tool, a virtual clinical evaluation tool was created in the form of an Excel spreadsheet. Excel documents have the advantage of immediate retrieval and use by instructors or students, ease of narration by word processing, automatic mathematical computation of formative and summative scores, and data storage through computer archives. Using the online Blackboard course, students and instructors are able to collaboratively input a Likert score for each posted evaluation outcome and add word-processed comments about students' clinical performance. An overview of the 2-year implementation of this virtual clinical evaluation tool, as well as the evaluation process, is discussed.

  10. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
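The regression and bootstrap steps described above can be sketched generically. In this hedged example the paleointensity is taken as the x-intercept of an MSP ratio Q plotted against laboratory field, with the y-intercept returned for the theoretically-prescribed-range check; the exact statistics MSP-Tool computes, and its resampling scheme, may differ:

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def bootstrap_intensity(h_lab, q, n_boot=2000, seed=0):
    """Paleointensity estimate as the x-intercept of Q vs. H_lab,
    plus the y-intercept and a bootstrap 95% confidence interval."""
    rng = random.Random(seed)
    slope, icpt = fit_line(h_lab, q)
    estimate = -icpt / slope
    pairs = list(zip(h_lab, q))
    boots = []
    while len(boots) < n_boot:
        sample = [rng.choice(pairs) for _ in pairs]
        xs = [p[0] for p in sample]
        if len(set(xs)) < 2:
            continue  # degenerate resample: all points at one field value
        s, i = fit_line(xs, [p[1] for p in sample])
        if s != 0:
            boots.append(-i / s)
    boots.sort()
    return estimate, icpt, (boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot)])
```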

  11. SuperTools Test and Evaluation Plan

    SciTech Connect

    Mannos, Tom J.

    2017-01-01

    Superconducting electronics (SCE) represents a potential path to efficient exascale computing for HPC and data center applications, but SCE-based circuit design lags far behind its CMOS equivalent. IARPA’s ongoing C3 program and its developing SuperTools program aim to jumpstart SCE R&D with the near-term goal of producing a high-speed, low-energy, 64-bit RISC processor using Josephson Junction based logic cells. SuperTools performers will develop software tools for efficient SCE design and accurate simulation and characterization of JJ-based circuits, which include the RSFQ, RQL, and AQFP logic families. T&E teams from NIST, MIT Lincoln Lab, Berkeley Lab, and Sandia National Labs will evaluate the tools and fabricate test circuits to compare with simulated results. The five-year, three-phase program includes 48 performer deliverables, three annual technical exchange meetings, and annual site visits.

  12. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system, provide facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and support function generator setup and evaluation.

  13. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
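The reliability computation described here reduces to an absorption-probability calculation on a discrete-time Markov chain. A minimal sketch by value iteration (the state layout and transition numbers are illustrative, not taken from the case study):

```python
def absorption_probability(P, start, success, tol=1e-12, max_iter=100000):
    """Probability that a discrete-time Markov chain started in state
    `start` is eventually absorbed in state `success`.

    P is a row-stochastic transition matrix (list of lists); absorbing
    states are those with P[i][i] == 1.0. Solved by value iteration.
    """
    n = len(P)
    # prob[i] = probability of eventual absorption in `success` from state i
    prob = [1.0 if i == success else 0.0 for i in range(n)]
    for _ in range(max_iter):
        new = []
        for i in range(n):
            if P[i][i] == 1.0:
                new.append(prob[i])  # absorbing states keep their value
            else:
                new.append(sum(P[i][j] * prob[j] for j in range(n)))
        if max(abs(a - b) for a, b in zip(new, prob)) < tol:
            return new[start]
        prob = new
    return prob[start]
```

Here, two components in sequence with reliabilities 0.99 and 0.95 (states 0 and 1, with state 2 = success, state 3 = failure) yield a system reliability of 0.99 × 0.95 = 0.9405.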

  14. Microcomputers: Software Evaluation. Evaluation Guides. Guide Number 17.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This guide discusses three critical steps in selecting microcomputer software and hardware: setting the context, software evaluation, and managing microcomputer use. Specific topics addressed include: (1) conducting an informal task analysis to determine how the potential user's time is spent; (2) identifying tasks amenable to computerization and…

  15. Identification and evaluation of software measures

    NASA Technical Reports Server (NTRS)

    Card, D. N.

    1981-01-01

    A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.
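Evaluating whether a measure "meaningfully characterizes" development typically starts with a simple association between the measure and an outcome such as effort. A minimal sketch of that first step (illustrative only; the paper's procedure is considerably more elaborate):

```python
def pearson_r(xs, ys):
    """Pearson correlation between a software measure (e.g. lines of code)
    and an outcome (e.g. person-hours expended)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```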

  16. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
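Precision and accuracy of label-free quantification, as LFQbench reports them, boil down to statistics of measured versus expected log-ratios between the hybrid samples. A hedged sketch in Python rather than R (the metric names and exact definitions here are illustrative, not LFQbench's):

```python
import math
import statistics

def ratio_metrics(measured_a, measured_b, expected_ratio):
    """Accuracy and precision of protein log2 ratios between samples A and B.

    Accuracy: median deviation of measured log2(A/B) from the expected value.
    Precision: standard deviation of the measured log2 ratios.
    """
    log_ratios = [math.log2(a / b) for a, b in zip(measured_a, measured_b)]
    expected = math.log2(expected_ratio)
    accuracy = statistics.median(r - expected for r in log_ratios)
    precision = statistics.stdev(log_ratios)
    return accuracy, precision
```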

  17. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  18. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  20. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that brings together efforts to introduce fuzzy logic into logic programming (LP), in order to incorporate more expressive resources in such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
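The semantics being translated can be pictured with a toy evaluation of a weighted fuzzy rule. This is a deliberately simplified sketch, not the multi-adjoint machinery itself (which pairs each conjunction with its residuated implication); the function names and the two t-norms shown are just common textbook choices:

```python
def godel_and(x, y):
    """Gödel t-norm: minimum."""
    return min(x, y)

def product_and(x, y):
    """Product t-norm."""
    return x * y

def evaluate_rule(body_degrees, rule_weight, conj=product_and):
    """Truth degree of a weighted fuzzy rule: the rule's weight combined,
    via the chosen t-norm, with the conjunction of the body atoms' degrees."""
    degree = 1.0
    for d in body_degrees:
        degree = conj(degree, d)
    return conj(rule_weight, degree)
```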

  1. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidance on which tools users should choose to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, MATLAB-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill fewer usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface.
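The thresholding that sufficed for figure-ground separation in the case study can be illustrated with Otsu's classic method, which picks the threshold maximizing between-class variance (a common choice; the reviewed tools may implement other thresholding schemes):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method on integer grayscale values in [0, levels):
    return the threshold t maximizing between-class variance, where the
    background class is pixels with value <= t."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```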

  2. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to move a critical part of the documentation process back to the office. The software presented facilitates a direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  3. Software tool for 3D extraction of germinal centers.

    PubMed

    Olivieri, David N; Escalona, Merly; Faro, Jose

    2013-01-01

    Germinal Centers (GC) are short-lived micro-anatomical structures, within lymphoid organs, where affinity maturation is initiated. Theoretical modeling of the dynamics of the GC reaction including follicular CD4+ T helper and the recently described follicular regulatory CD4+ T cell populations, predicts that the intensity and life span of such reactions is driven by both types of T cells, yet controlled primarily by follicular regulatory CD4+ T cells. In order to calibrate GC models, it is necessary to properly analyze the kinetics of GC sizes. Presently, the estimation of spleen GC volumes relies upon confocal microscopy images from 20-30 slices spanning a depth of ~ 20 - 50 μm, whose GC areas are analyzed, slice-by-slice, for subsequent 3D reconstruction and quantification. The quantity of data to be analyzed from such images taken for kinetics experiments is usually too large to extract semi-manually with existing software. As a result, the entire procedure is highly time-consuming and inaccurate, motivating the need for a new software tool that can automatically identify and calculate the 3D spot volumes from GC multidimensional images. We have developed pyBioImage, an open source cross platform image analysis software application, written in python with C extensions, that is specifically tailored to the needs of immunologic research involving 4D imaging of GCs. The software provides 1) support for importing many multi-image formats, 2) basic image processing and analysis, and 3) the ExtractGC module, that allows for automatic analysis and visualization of extracted GC volumes from multidimensional confocal microscopy images. We present concrete examples of different microscopy image data sets of GC that have been used in experimental and theoretical studies of mouse model GC dynamics. The pyBioImage software framework seeks to be a general purpose image application for immunological research based on 4D imaging. The ExtractGC module uses a

  4. An Evaluation of Software Cost Estimating Models.

    DTIC Science & Technology

    1981-06-01

    Report period: Sep 1973 - Oct 1979. Author: Robert Thibodeau. Once review of the draft DCP begins, the program can be terminated with the approval of the highest command level which authorized it. Once DSARC review begins... in concert with many other elements. Initially, we might speak of the navigation subsystem and its functions. Later, we would describe the alignment element

  5. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  6. Preliminary evaluation of the publicly available Laboratory for Breast Radiodensity Assessment (LIBRA) software tool: comparison of fully automated area and volumetric density measures in a case-control study with digital mammography.

    PubMed

    Keller, Brad M; Chen, Jinbo; Daye, Dania; Conant, Emily F; Kontos, Despina

    2015-08-25

    Breast density, commonly quantified as the percentage of mammographically dense tissue area, is a strong breast cancer risk factor. We investigated associations between breast cancer and fully automated measures of breast density made by a new publicly available software tool, the Laboratory for Individualized Breast Radiodensity Assessment (LIBRA). Digital mammograms from 106 invasive breast cancer cases and 318 age-matched controls were retrospectively analyzed. Density estimates acquired by LIBRA were compared with commercially available software and standard Breast Imaging-Reporting and Data System (BI-RADS) density estimates. Associations between the different density measures and breast cancer were evaluated by using logistic regression after adjustment for Gail risk factors and body mass index (BMI). Area under the curve (AUC) of the receiver operating characteristic (ROC) was used to assess discriminatory capacity, and odds ratios (ORs) for each density measure are provided. All automated density measures had a significant association with breast cancer (OR = 1.47-2.23, AUC = 0.59-0.71, P < 0.01) which was strengthened after adjustment for Gail risk factors and BMI (OR = 1.96-2.64, AUC = 0.82-0.85, P < 0.001). In multivariable analysis, absolute dense area (OR = 1.84, P < 0.001) and absolute dense volume (OR = 1.67, P = 0.003) were jointly associated with breast cancer (AUC = 0.77, P < 0.01), having a larger discriminatory capacity than models considering the Gail risk factors alone (AUC = 0.64, P < 0.001) or the Gail risk factors plus standard area percent density (AUC = 0.68, P = 0.01). After BMI was further adjusted for, absolute dense area retained significance (OR = 2.18, P < 0.001) and volume percent density approached significance (OR = 1.47, P = 0.06). This combined area-volume density model also had a significantly (P < 0.001) improved discriminatory capacity (AUC = 0.86) relative to a model considering the Gail risk factors plus BMI (AUC = 0
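The discriminatory-capacity figures quoted above are areas under ROC curves; the AUC has a simple rank interpretation (the probability that a random case scores above a random control) that can be computed directly. A minimal sketch (illustrative only; the study derived its scores from adjusted logistic regression models, not raw density values):

```python
def auc(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    fraction of case/control pairs where the case scores higher, with
    ties counting one half."""
    wins = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))
```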

  7. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    SciTech Connect

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MATLAB (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
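The gamma-index analysis such a module performs can be sketched in one dimension. This is a simplified global gamma evaluation over discrete points (the tool's actual search strategy, interpolation and tolerances are not specified in the abstract):

```python
def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma: for each reference point, the minimum over
    evaluated points of sqrt((dose diff / dose criterion)^2 +
    (distance / distance criterion)^2). dose_tol is a fraction of the
    maximum reference dose; dist_tol is in the positions' units (mm)."""
    d_norm = dose_tol * max(ref_dose)
    gammas = []
    for xr, dr in zip(ref_pos, ref_dose):
        g2 = min(((de - dr) / d_norm) ** 2 + ((xe - xr) / dist_tol) ** 2
                 for xe, de in zip(eval_pos, eval_dose))
        gammas.append(g2 ** 0.5)
    return gammas

def pass_rate(gammas):
    """Fraction of reference points with gamma <= 1."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```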

  8. Evaluation of Computer Software for Use in the Classroom.

    ERIC Educational Resources Information Center

    Johnson, William E.

    To help teachers cope with the proliferation of software and software sources, a number of resources are available to aid in the evaluation and selection of educational software. For instance, both the "Educator's Handbook and Software Directory" and "Swift's Directory of Educational Software, Apple II Edition" provide listings…

  9. Software for evaluation of EPR-dosimetry performance.

    PubMed

    Shishkina, E A; Timofeev, Yu S; Ivanov, D V

    2014-06-01

    Electron paramagnetic resonance (EPR) with tooth enamel is a method extensively used for retrospective external dosimetry. Different research groups apply different equipment, sample preparation procedures and spectrum processing algorithms for EPR dosimetry. A uniform algorithm for description and comparison of performances was designed and implemented in a new computer code. The aim of the paper is to introduce the new software 'EPR-dosimetry performance'. The computer code is a user-friendly tool for providing a full description of method-specific capabilities of EPR tooth dosimetry, from metrological characteristics to practical limitations in applications. The software designed for scientists and engineers has several applications, including support of method calibration by evaluation of calibration parameters, evaluation of critical value and detection limit for registration of radiation-induced signal amplitude, estimation of critical value and detection limit for dose evaluation, estimation of minimal detectable value for anthropogenic dose assessment and description of method uncertainty.
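The critical value and detection limit mentioned are conventionally defined following Currie. A minimal sketch under the assumption of a known, constant blank standard deviation sigma_0 and equal type I/II error rates of 5% (whether the software uses exactly this formulation is not stated in the abstract):

```python
def currie_limits(sigma_blank, k=1.645):
    """Currie's decision thresholds for a net signal over a blank with
    known, constant standard deviation sigma_blank:
      L_C = k * sigma_blank   (critical value: decide 'detected')
      L_D = 2 * L_C           (detection limit: signal reliably detected)
    k = 1.645 corresponds to 5% false-positive/false-negative rates."""
    l_c = k * sigma_blank
    l_d = 2.0 * l_c
    return l_c, l_d
```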

  10. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scene. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program in order to conduct predefined exercises where they have the opportunity to analyse results at a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.
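Among the quantities students analyse are rotation matrices. One common photogrammetric omega-phi-kappa parameterization can be sketched as follows (rotation-order conventions vary between textbooks, and this particular composition is an assumption, not necessarily the one PhoX uses):

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_matrix(omega, phi, kappa):
    """One common omega-phi-kappa convention (angles in radians):
    R = R_x(omega) @ R_y(phi) @ R_z(kappa)."""
    return matmul(matmul(rot_x(omega), rot_y(phi)), rot_z(kappa))
```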

  11. A software tool for 3D dose verification and analysis

    NASA Astrophysics Data System (ADS)

    Sa'd, M. Al; Graham, J.; Liney, G. P.

    2013-06-01

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments, because with the very high dose gradients used in modern treatment techniques, a small spatial error in the dose distribution can lead to a serious complication. In order to gain the full benefits of 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which uses magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as distributions planned with different treatment planning systems or different dose calculation algorithms.
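    A minimal illustration of comparing two 3D dose distributions, using a simple voxel-wise dose-difference criterion. This is a simplified stand-in for a full evaluation, which would typically also include gamma analysis with a distance-to-agreement criterion:

```python
import numpy as np

def dose_pass_rate(reference, evaluated, tolerance=0.03):
    """Fraction of voxels whose evaluated dose lies within a relative
    tolerance (default 3%) of the reference dose."""
    ref = np.asarray(reference, dtype=float)
    ev = np.asarray(evaluated, dtype=float)
    mask = ref > 0  # ignore voxels with no reference dose
    rel_diff = np.abs(ev[mask] - ref[mask]) / ref[mask]
    return float(np.mean(rel_diff <= tolerance))

ref = np.full((10, 10, 10), 2.0)   # uniform 2 Gy reference volume
ev = ref * 1.02                    # 2% uniform overdose
print(dose_pass_rate(ref, ev))     # 1.0 (every voxel within the 3% criterion)
```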

  12. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  13. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation, plus finite element analysis (FEA), give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers currently use IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation, with some dynamic analysis capabilities in robotic control, exploration of alternatives with more powerful dynamic analysis and FEA capabilities was necessary. Kinematic analysis examines only the displacement, velocity, and acceleration of a mechanism, without considering the effects of component masses. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, the software tools are compared in the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. A Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP, and an office chair mechanism, were used as example simulations.
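    The distinction drawn above between kinematic and dynamic analysis can be sketched numerically: kinematics differentiates a trajectory, while dynamics adds mass properties to recover forces or torques. The trajectory and inertia values here are illustrative only:

```python
import numpy as np

# Kinematics: differentiate a joint angle theta(t) to get angular velocity
# and acceleration (central differences via np.gradient).
t = np.linspace(0.0, 2.0, 401)
theta = 0.5 * t**2             # rad, constant angular acceleration of 1 rad/s^2
omega = np.gradient(theta, t)  # rad/s
alpha = np.gradient(omega, t)  # rad/s^2

# Dynamics: with a moment of inertia, the same motion yields the driving
# torque (tau = I * alpha for a single rigid body about a fixed axis).
I = 2.5          # kg*m^2, assumed moment of inertia
tau = I * alpha  # N*m

print(round(alpha[200], 3), round(tau[200], 3))  # 1.0 2.5
```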

  15. SPIRou @ CFHT: data reduction software and simulation tools

    NASA Astrophysics Data System (ADS)

    Artigau, Étienne; Bouchy, François; Delfosse, Xavier; Bonfils, Xavier; Donati, Jean-François; Figueira, Pedro; Thanjavur, Karun; Lafrenière, David; Doyon, René; Surace, Christian; Moutou, Claire; Boisse, Isabelle; Saddlemyer, Leslie; Loop, David; Kouach, Driss; Pepe, Francesco; Lovis, Christophe; Hernandez, Olivier; Wang, Shiang-Yu

    2012-09-01

    SPIRou is a near-infrared echelle spectropolarimeter/velocimeter under design for the 3.6 m Canada-France-Hawaii Telescope (CFHT) on Mauna Kea, Hawaii. The unique scientific capabilities and technical design features are described in the accompanying papers at this conference. In this paper we focus on the data reduction software (DRS) and the data simulation tool. The SPIRou DRS builds upon the experience of the existing SOPHIE, HARPS and ESPADONS spectrographs, class-leading instruments for high-precision RV measurements and spectropolarimetry. While SPIRou shares many characteristics with these instruments, moving to the near-infrared domain brings specific data-processing challenges: the presence of a large number of telluric absorption lines, strong emission sky lines, thermal background, science arrays with poorer cosmetics, etc. In order for the DRS to be fully functional for SPIRou's first light in 2015, we developed a data simulation tool that incorporates numerous instrumental and observational effects. We present an overview of the DRS and the simulation tool architectures.

  16. Early Results from Characterizing Verification Tools Through Coding Error Candidates Reported in Space Flight Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Fischer, Anton; Pinto, Mario; Prause, Christian R.

    2016-08-01

    Six software verification tools were applied to space flight software, and the findings reported by each tool were compared in order to derive footprints of each tool's fault-identification capabilities. The currently available results are provided in this paper: the sensitivity and precision of individual tools and of pairwise combinations of tools from the set. The reader should bear in mind that the results presented here depend on the spectrum of fault types present in the reference software and on the configuration of the tools with respect to the real defects and fault types of interest for embedded systems and space flight software.
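    The sensitivity and precision metrics used to characterise such tools follow their standard definitions, sketched below. The defect and finding identifiers are hypothetical, not the study's data:

```python
def tool_footprint(reported, real_defects):
    """Sensitivity (recall) and precision of a verification tool, given the
    set of findings it reported and the set of real defects in the code."""
    reported, real_defects = set(reported), set(real_defects)
    tp = len(reported & real_defects)  # true positives: real defects found
    sensitivity = tp / len(real_defects) if real_defects else 0.0
    precision = tp / len(reported) if reported else 0.0
    return sensitivity, precision

# Tool reports 3 findings; 2 are among the 4 real defects.
sens, prec = tool_footprint({"d1", "d2", "d5"}, {"d1", "d2", "d3", "d4"})
print(sens, round(prec, 3))  # 0.5 0.667
```

    Combining two tools amounts to taking the union of their reported sets before computing the same metrics.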

  17. User Guide for the STAYSL PNNL Suite of Software Tools

    SciTech Connect

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
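    The generalized least-squares adjustment described above can be sketched as a textbook GLS update. This is not the STAYSL PNNL implementation itself, and all matrices and numbers below are illustrative:

```python
import numpy as np

def gls_adjust(x0, P, A, y, V):
    """Adjust a prior flux spectrum x0 (covariance P) against measured
    reaction rates y (covariance V), where A maps flux groups to rates."""
    S = A @ P @ A.T + V             # innovation covariance
    K = P @ A.T @ np.linalg.inv(S)  # gain
    x_adj = x0 + K @ (y - A @ x0)   # adjusted spectrum
    P_adj = P - K @ A @ P           # adjusted (reduced) covariance
    return x_adj, P_adj

x0 = np.array([1.0, 2.0])               # prior 2-group flux (arbitrary units)
P = np.diag([0.04, 0.16])               # prior flux covariance
A = np.array([[0.5, 0.1], [0.2, 0.4]])  # assumed response matrix
y = A @ np.array([1.1, 1.9])            # rates from a slightly different "true" flux
V = np.diag([1e-4, 1e-4])               # small measurement covariance
x_adj, P_adj = gls_adjust(x0, P, A, y, V)
```

    With precise measurements (small V), the adjusted spectrum is pulled close to the flux implied by the measured rates, and the adjusted covariance shrinks relative to the prior.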

  18. Utilizing Spectroscopic Research Tools and Software in the Classroom

    NASA Astrophysics Data System (ADS)

    Grubbs, G. S., II

    2015-06-01

    Given today's technological age, it has become crucial to reach students in a more "tech-savvy" way than traditional classroom methods afford. There is already a vast range of software packages available to the molecular spectroscopist that can easily and successfully be introduced to the classroom. This talk will highlight taking a few of these tools (Gaussian09, SPFIT/SPCAT, the AABS package, LabVIEW, etc.) and implementing them in the classroom to teach subjects such as quantum mechanics and thermodynamics, as well as to aid in linking these subjects. Examples of project implementation with both undergraduate and graduate students will be presented, with a discussion of the successes and failures of such attempts.

  19. EVALUATING ENVIRONMENTAL DECISION SUPPORT TOOLS.

    SciTech Connect

    SULLIVAN, T.

    2004-10-01

    Effective contaminated land management requires a number of decisions addressing a suite of technical, economic, and social concerns. These concerns include human health risks, ecological risks, economic costs, technical feasibility of proposed remedial actions, and the value society places on clean-up and re-use of formerly contaminated lands. Decision making in the face of uncertainty and multiple, often conflicting objectives is a vital and challenging part of environmental management, one that affects a significant economic activity. Although each environmental remediation problem is unique and requires a site-specific analysis, many of the key decisions are similar in structure. This has led many to attempt to develop standard approaches. As part of the standardization process, attempts have been made to codify specialist expertise into decision support tools. This activity is intended to facilitate reproducible and transparent decision making. The process of codifying procedures has also been found to be a useful activity for establishing and rationalizing management processes. This study has two primary objectives. The first is to develop a taxonomy for Decision Support Tools (DSTs) to provide a framework for understanding the different tools and what they are designed to address in the context of environmental remediation problems. The taxonomy will have a series of subject areas for the DSTs. From these subjects, a few key areas will be selected for further study, and software in these areas will be identified. The second objective will be to review the existing DSTs in the selected areas and develop a screening matrix for each software product.

  20. Talkoot: software tool to create collaboratories for earth science

    SciTech Connect

    Movva, Sunil; Ramachandran, Rahul; Maskey, Manil; Kulkarni, Ajinkya; Conover, Helen; Nair, U.S.

    2012-01-01

    Open science, where researchers share and publish every element of their research process in addition to the final results, can foster novel ways of collaboration among researchers and has the potential to spontaneously create new virtual research collaborations. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. Advances in technology allow for software tools that can be used by different research groups and institutions to build and support virtual collaborations and infuse open science. This paper describes Talkoot, a software toolkit designed and developed by the authors to provide Earth science researchers a ready-to-use knowledge management environment and an online platform for collaboration. Talkoot allows Earth science researchers a means to systematically gather, tag and share their data, analysis workflows and research notes. These Talkoot features are designed to foster rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium-sized groups and research centers, as well as large enterprises such as national laboratories and federal agencies.

  1. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of software based on an algebraic method for XML data processing, which accelerates XML parsing. The proposed nontraditional approach to fast XML navigation with algebraic tools contributes to ongoing efforts to build an easier, user-friendly API for XML transformations. The proposed XML-processing software (parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach to searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is easily accessible to web consumers, who can control XML file processing, search for different elements (tags), delete them, and add new XML content. The various tests presented show higher speed and lower resource consumption in comparison with some existing commercial parsers.
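    The operations such a parser exposes (searching for tags, deleting elements, adding content) can be illustrated with the Python standard library for contrast; this is not the authors' algebraic implementation:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("<catalog><book id='b1'><title>XML</title></book></catalog>")

# Search: find every <title> element anywhere in the tree.
titles = [t.text for t in doc.iter("title")]

# Add: append a new <book> element with its own <title>.
new = ET.SubElement(doc, "book", id="b2")
ET.SubElement(new, "title").text = "Algebra"

# Delete: remove the original book by attribute match.
doc.remove(doc.find("book[@id='b1']"))

print(titles, ET.tostring(doc, encoding="unicode"))
```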

  2. Wind Evaluation Breadboard electronics and software

    NASA Astrophysics Data System (ADS)

    Núñez, Miguel; Reyes, Marcos; Viera, Teodora; Zuluaga, Pablo

    2008-07-01

    WEB, the Wind Evaluation Breadboard, is an Extremely Large Telescope primary mirror simulator, developed with the aim of quantifying the ability of a segmented primary mirror to cope with wind disturbances. This instrument, supported by the European Community (Framework Programme 6, ELT Design Study), was developed by ESO, IAC, MEDIA-ALTRAN, JUPASA and FOGALE. The WEB is a bench of about 20 tons and 7 meters in diameter emulating a segmented primary mirror and its cell, with 7 hexagonal segment simulators, including electromechanical support systems. In this paper we present the WEB central control electronics and the software development, which has to interface with position actuators, auxiliary slave actuators, edge sensors, the azimuth ring, the elevation actuator, a meteorological station and the air-balloon enclosure. The set of subsystems to control is a reduced version of a real telescope segmented primary mirror control system with high real-time performance, but emphasizing development-time efficiency and flexibility, because WEB is a test bench. The paper includes a detailed description of the hardware and software, paying special attention to real-time performance. The hardware is composed of three computers, and the software architecture has been divided into three intercommunicating applications, implemented using LabVIEW on Windows XP and the Pharlap ETS real-time operating system. The edge sensor and position actuator closed loop has a sampling and commanding frequency of 1 kHz.

  3. Evaluation of the DDSolver software applications.

    PubMed

    Zuo, Jieyu; Gao, Yuan; Bou-Chacra, Nadia; Löbenberg, Raimar

    2014-01-01

    When a new oral dosage form is developed, its dissolution behavior must be quantitatively analyzed. Dissolution analysis involves a comparison of the dissolution profiles and the application of mathematical models to describe the drug release pattern. This report aims to assess the application of DDSolver, an Excel add-in software package designed to analyze data obtained from dissolution experiments. The data used in this report were chosen from two dissolution studies. The results of the DDSolver analysis were compared with those obtained using an Excel worksheet. The comparisons among three different products yielded similarity factors (f2) of 23.21, 46.66, and 17.91 using both DDSolver and the Excel worksheet. The results differed when DDSolver and Excel were used to calculate the release exponent "n" in the Korsmeyer-Peppas model. Performing routine quantitative analysis proved to be much easier with the DDSolver program than with an Excel spreadsheet. Use of the DDSolver program reduced calculation time and has the potential to avoid calculation errors, making this software package a convenient tool for dissolution comparison.
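    The similarity factor f2 that both DDSolver and the Excel worksheet compute follows a standard formula over matched time points; the dissolution profiles below are illustrative:

```python
import numpy as np

def f2(reference, test):
    """Similarity factor f2 for two dissolution profiles (percent dissolved
    at matched time points): f2 = 50*log10(100 / sqrt(1 + mean squared diff)).
    Profiles with f2 >= 50 are conventionally considered similar."""
    R = np.asarray(reference, dtype=float)
    T = np.asarray(test, dtype=float)
    msd = np.mean((R - T) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

# Identical profiles give the maximum f2 of 100.
print(round(f2([20, 45, 70, 90], [20, 45, 70, 90]), 1))  # 100.0
```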

  4. The Software Line-up: What Reviewers Look for When Evaluating Software.

    ERIC Educational Resources Information Center

    ELECTRONIC Learning, 1982

    1982-01-01

    Contains a checklist to aid teachers in evaluating software used in computer-assisted instruction on microcomputers. The evaluation form contains three sections: program description, program evaluation, and overall evaluation. A brief description of a software evaluation program in use at the Granite School District in Utah is included. (JJD)

  5. A software tool for removing patient identifying information from clinical documents.

    PubMed

    Friedlin, F Jeff; McDonald, Clement J

    2008-01-01

    We created a software tool that accurately removes all patient identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed 2 evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluation, we performed a second evaluation using 7,190 pathology report HL7 messages. We compared the results of the MeDS de-identification process to a gold standard of human review to find identifying strings. For both evaluations, we calculated the number of successful scrubs, missed identifiers, and over-scrubs committed by MeDS and evaluated the readability and interpretability of the scrubbed messages. We categorized all missed identifiers into 3 groups: (1) complete HIPAA-specified identifiers, (2) HIPAA-specified identifier fragments, and (3) non-HIPAA-specified identifiers (such as provider names and addresses). In the first-pass evaluation, MeDS scrubbed 11,273 (99.06%) of the 11,380 HIPAA-specified identifiers and 38,095 (98.26%) of the 38,768 non-HIPAA-specified identifiers. In our second evaluation (after modification of the software), MeDS scrubbed 79,993 (99.47%) of the 80,418 HIPAA-specified identifiers and 12,689 (96.93%) of the 13,091 non-HIPAA-specified identifiers. Approximately 95% of scrubbed messages were both readable and interpretable. We conclude that MeDS successfully de-identified a wide range of medical documents from numerous sources and creates scrubbed reports that retain their interpretability, thereby maintaining their usefulness for research.
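    Pattern-based scrubbing of the kind MeDS performs can be shown in miniature: replace matched identifiers with category tags. The three patterns below are illustrative examples, far simpler than the MeDS rule set:

```python
import re

# Illustrative HIPAA-style patterns mapped to replacement tags.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{2}/\d{2}/\d{4}\b"), "[DATE]"),
    (re.compile(r"\b[\w.]+@[\w.]+\.\w+\b"), "[EMAIL]"),
]

def scrub(text):
    """Replace each matched identifier with its category tag, so the
    scrubbed message stays readable and interpretable."""
    for pattern, tag in PATTERNS:
        text = pattern.sub(tag, text)
    return text

msg = "Pt seen 01/02/2008, SSN 123-45-6789, contact jdoe@example.org"
print(scrub(msg))  # Pt seen [DATE], SSN [SSN], contact [EMAIL]
```

    Measuring such a scrubber against human review then yields exactly the successful-scrub, missed-identifier, and over-scrub counts the evaluation reports.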

  6. Software Development Outsourcing Decision Support Tool with Neural Network Learning

    DTIC Science & Technology

    2004-03-01

    software domain, enterprise scripting software domain, and outsourcing (maintenance and training) processes found to be included in the new model but not in...accounting and order entry) software domains, and outsourcing (maintenance, configuration management and software engineer support) processes were...found in the original model but not in the new model included: enterprise (scripting and order entry) software domains and outsourcing maintenance process

  7. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid-assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
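    The Manning equation underlying the GFT can be sketched directly (SI form); the channel parameters below are hypothetical, not from the tool's validation sites:

```python
def manning_discharge(n, area, hydraulic_radius, slope):
    """Steady open-channel discharge (m^3/s) from the Manning equation,
    SI form: Q = (1/n) * A * R^(2/3) * S^(1/2), where n is the roughness
    coefficient, A the flow area (m^2), R the hydraulic radius (m), and
    S the channel slope (dimensionless)."""
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical natural stream: n=0.035, A=20 m^2, R=1.5 m, slope 0.001.
Q = manning_discharge(0.035, 20.0, 1.5, 0.001)
print(round(Q, 1))  # 23.7
```

    The GFT effectively inverts this relationship over a digital elevation model to find the water surface consistent with a given discharge.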

  8. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue in the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is currently used in the ongoing Student Space Exploration & Technology Initiative (SSETI, www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool: a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework and structure is given, explaining the different actors, the flows between them, and the different types of data and the links (formulas) between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. The SDM's functionalities, developed as VBA (Visual Basic for Applications) scripts, are then introduced, along with its interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of the current version of SDM. Taking these capabilities and limitations into account, the third part outlines the next version of SDM: a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, and hardware and software issues are also thoroughly discussed.

  9. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  10. Energy efficiency assessment methods and tools evaluation

    SciTech Connect

    McMordie, K.L.; Richman, E.E.; Keller, J.M.; Dixon, D.R.

    1994-08-01

    Many different methods of assessing the energy savings potential at federal installations and identifying attractive projects for capital investment have been used by the different federal agencies. These methods range from high-level estimating tools to detailed design tools, both manual and software-assisted. The methods have different purposes and provide results that are used for different parts of the project identification and implementation process. Seven different assessment methods are evaluated in this study. These methods were selected by the program managers at the DoD Energy Policy Office and the DOE Federal Energy Management Program (FEMP). Each method was applied to similar buildings at Bolling Air Force Base (AFB), unless it was inappropriate or the method was designed for an installation-wide analysis rather than for particular buildings. Staff at Bolling AFB controlled the collection of data.

  11. Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.

    ERIC Educational Resources Information Center

    Lovell, Ashley C., Comp.

    Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…

  13. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  14. New CFD tools to evaluate nasal airflow.

    PubMed

    Burgos, M A; Sanmiguel-Rojas, E; Del Pino, C; Sevilla-García, M A; Esteban-Ortega, F

    2017-08-01

    Computational fluid dynamics (CFD) is a mathematical tool to analyse airflow. As CFD is currently not a usual tool for rhinologists, a group of engineers, in collaboration with experts in rhinology, has developed a very intuitive CFD software package. The program MECOMLAND(®) requires only snapshots from the patient's cross-sectional (tomographic) images; its output comprises CFD-derived results such as airflow distributions, velocity profiles, pressure, temperature, and wall shear stress. This is useful complementary information for the diagnosis, prognosis, or follow-up of nasal pathologies based on quantitative magnitudes linked to airflow. In addition, the user-friendly environment NOSELAND(®) significantly helps the medical assessment in the post-processing phase, with dynamic reports using a 3D endoscopic view. Specialists in rhinology had asked for a more intuitive, simple, powerful CFD software package offering more quality and precision in their work to evaluate nasal airflow. We present MECOMLAND(®) and NOSELAND(®), which have all the expected characteristics to fulfil this demand and offer a proper assessment with maximum quality and safety for the patient. These programs represent a non-invasive, low-cost (as the CT scan is already performed in every patient) alternative for the functional study of the difficult rhinologic case. To validate the software, we studied two groups of patients from the Ear, Nose and Throat clinic: a first group with normal noses and a second group presenting septal deviations. Wall shear stresses are lower in normal noses than in those with septal deviation. In addition, velocity field distributions, the pressure drop between nasopharynx and ambient, and the flow rates in each nostril differed between the nasal cavities of the two groups. These software modules open up a promising future to simulate the nasal airflow behaviour in virtual surgery intervention scenarios under different pressure or

  15. Dental students' evaluations of an interactive histology software.

    PubMed

    Rosas, Cristian; Rubí, Rafael; Donoso, Manuel; Uribe, Sergio

    2012-11-01

    This study assessed dental students' evaluations of a new Interactive Histology Software (IHS) developed by the authors and compared students' assessment of the extent to which this new software, as well as other histology teaching methods, supported their learning. The IHS is a computer-based tool for histology learning that presents high-resolution images of histology basics as well as specific oral histologies at different magnifications and with text labels. Survey data were collected from 204 first-year dental students at the Universidad Austral de Chile. The survey consisted of questions for the respondents to evaluate the characteristics of the IHS and the contribution of various teaching methods to their histology learning. The response rate was 85 percent. Student evaluations were positive for the design, usability, and theoretical-practical integration of the IHS, and the students reported they would recommend the method to future students. The students continued to value traditional teaching methods for histological lab work and did not think this new technology would replace traditional methods. With respect to the contribution of each teaching method to students' learning, no statistically significant differences (p>0.05) were found for an evaluation of IHS, light microscopy, and slide presentations. However, these student assessments were significantly more positive than the evaluations of other digital or printed materials. Overall, the students evaluated the IHS very positively in terms of method quality and contribution to their learning; they also evaluated use of light microscopy and teacher slide presentations positively.

  16. Decision graphs: a tool for developing real-time software

    SciTech Connect

    Kozubal, A.J.

    1981-01-01

    The use of decision graphs in the preparation of real-time software, in particular, is briefly described. The usefulness of decision graphs in software design, testing, and maintenance is pointed out. 2 figures. (RWR)

  17. Teaching Undergraduate Software Engineering Using Open Source Development Tools

    DTIC Science & Technology

    2012-01-01

    on Computer Science Education (SIGCSE), 153-158. Pandey, R. (2009). Exploiting web resources for teaching/learning best software design tips...Issues in Informing Science and Information Technology Volume 9, 2012 Teaching Undergraduate Software Engineering Using Open Source Development...multi-course sequence, to teach students both the theoretical concepts of software development as well as the practical aspects of developing software

  18. BEASTling: A software tool for linguistic phylogenetics using BEAST 2

    PubMed Central

    Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784

  19. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, G.; Ferguson, B. L.

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study has indicated the suitability and benefit of applying finite element analysis to perform computer based experiments to decrease but not eliminate physical experiments on TBC's. This program proved feasibility by expanding on these findings by developing a larger knowledge base and developing a procedure to extract rules to aid in TBC design.
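
    The Yates analysis mentioned above reduces the responses of a 2^k factorial design to effect estimates through k passes of pairwise sums and differences. A minimal sketch of the algorithm (an illustrative implementation, not the authors' code):

```python
def yates(responses):
    """Yates' algorithm for a 2^k factorial design.

    `responses` must be in standard (Yates) order, e.g. for k=2:
    (1), a, b, ab.  Returns the contrast column; dividing the first
    entry by 2^k gives the grand mean, the rest by 2^(k-1) give the
    factor and interaction effects.
    """
    n = len(responses)
    k = n.bit_length() - 1
    assert 2 ** k == n, "length must be a power of two"
    col = list(responses)
    for _ in range(k):
        sums = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    return col

# 2^2 example: responses in standard order (1), a, b, ab
contrasts = yates([28, 36, 18, 31])
mean = contrasts[0] / 4        # grand mean
effect_a = contrasts[1] / 2    # main effect of factor A
effect_b = contrasts[2] / 2    # main effect of factor B
effect_ab = contrasts[3] / 2   # AB interaction
```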

  1. BEASTling: A software tool for linguistic phylogenetics using BEAST 2.

    PubMed

    Maurits, Luke; Forkel, Robert; Kaiping, Gereon A; Atkinson, Quentin D

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.

  2. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    NASA Astrophysics Data System (ADS)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  3. The Educational Software Design and Evaluation for K-8: Oral and Dental Health Software

    ERIC Educational Resources Information Center

    Kabakci, Isil; Birinci, Gurkay; Izmirli, Serkan

    2007-01-01

    The aim of this study is to report on the development of the software "Oral and Dental Health," which supplements the course of Science and Technology for K-8 students in the primary school curriculum, and to carry out an evaluation study of the software. This software has been prepared for educational purposes. In relation to the…

  4. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of the software in SHMTools is embeddable, consisting of MATLAB functions that can be cross-compiled into generic C programs to run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software are discussed, along with several example processes that can serve as guidelines for future use of the software.
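
    The three SHM stages described above can be caricatured in a few lines. Since SHMTools itself is a MATLAB package, the function names and the RMS feature below are hypothetical stand-ins, not its API:

```python
import math

def acquire(signal):
    # stand-in for data acquisition: pass a measured time series through
    return list(signal)

def extract_features(samples):
    # RMS amplitude as a single damage-sensitive feature
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def classify(feature, baseline_rms, threshold=1.5):
    # flag damage when the feature exceeds the healthy baseline by a factor
    return "damaged" if feature > threshold * baseline_rms else "healthy"

baseline = extract_features(acquire([0.1, -0.1, 0.12, -0.09]))
state = classify(extract_features(acquire([0.4, -0.5, 0.45, -0.4])), baseline)
```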

  5. Article Reprints from "The Computing Teacher" on Software Evaluations.

    ERIC Educational Resources Information Center

    International Council for Computers in Education, Eugene, OR.

    Reprinted from the "The Computing Teacher," this collection of nine articles presents information on computer software selection and evaluation. The articles include: (1) "The DISC Model for Software Evaluation and Support Material Design" (Shelley Yorke Rose and Carol Klenow); (2) "Selecting Computer Software--We Take It…

  7. Research on software behavior trust based on hierarchy evaluation

    NASA Astrophysics Data System (ADS)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlations in software behavior, we evaluate software behavior credibility at two levels: control flow and data flow. At the control-flow level, a method for tracing software behavior based on a support vector machine (SVM) is proposed. At the data-flow level, a behavioral-evidence evaluation based on a fuzzy decision analysis method is put forward.
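
    A common way to prepare a control-flow trace for an SVM is to encode it as n-gram counts. The abstract does not give the paper's encoding, so the following is an illustrative sketch only:

```python
from collections import Counter

def ngram_features(trace, n=2):
    """Turn a control-flow trace (a sequence of event names) into
    n-gram counts, a standard feature encoding for SVM-based
    behavior classification (illustrative, not the paper's code)."""
    return Counter(tuple(trace[i:i + n]) for i in range(len(trace) - n + 1))

trace = ["open", "read", "read", "write", "close"]
feats = ngram_features(trace)
```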

  8. Evaluation of Optical Disk Jukebox Software.

    ERIC Educational Resources Information Center

    Ranade, Sanjay; Yee, Fonald

    1989-01-01

    Discusses software that is used to drive and access optical disk jukeboxes, which are used for data storage. Categories of the software are described, user categories are explained, the design of implementation approaches is discussed, and representative software products are reviewed. (eight references) (LRW)

  9. Evaluating Accounting Software in Secondary Schools.

    ERIC Educational Resources Information Center

    Chalupa, Marilyn

    1988-01-01

    The secondary accounting curriculum must be modified to include computers and software. Educators must be aware of the computer skills needed on the job and of the accounting software that is available. Software selection must be tailored to fit the curriculum and the time available. (JOW)

  10. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  11. FACET: Future ATM Concepts Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Bilmoria, Karl D.; Banavar, Sridhar; Chatterji, Gano B.; Sheth, Kapil S.; Grabbe, Shon

    2000-01-01

    FACET (Future ATM Concepts Evaluation Tool) is an Air Traffic Management research tool being developed at the NASA Ames Research Center. This paper describes the design, architecture and functionalities of FACET. The purpose of FACET is to provide a simulation environment for the exploration, development and evaluation of advanced ATM concepts. Examples of these concepts include new ATM paradigms such as Distributed Air-Ground Traffic Management, airspace redesign and new Decision Support Tools (DSTs) for controllers working within the operational procedures of the existing air traffic control system. FACET is currently capable of modeling system-wide en route airspace operations over the contiguous United States. Airspace models (e.g., Center/sector boundaries, airways, locations of navigation aids and airports) are available from databases. A core capability of FACET is the modeling of aircraft trajectories. Using round-earth kinematic equations, aircraft can be flown along flight plan routes or great circle routes as they climb, cruise and descend according to their individual aircraft-type performance models. Performance parameters (e.g., climb/descent rates and speeds, cruise speeds) are obtained from data table lookups. Heading, airspeed and altitude-rate dynamics are also modeled. Additional functionalities will be added as necessary for specific applications. FACET software is written in the Java and C programming languages. It is platform-independent and can be run on a variety of computers. FACET has been designed with a modular software architecture to enable rapid integration of research prototype implementations of new ATM concepts. Several advanced ATM concepts are currently being implemented in FACET: airborne separation assurance, dynamic density predictions, airspace redesign (re-sectorization), benefits of a controller DST for direct routing, and the integration of commercial space transportation system operations into the U.S. National
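
    The great-circle routing mentioned above rests on round-earth geometry such as the haversine formula. A minimal sketch (not FACET's implementation; the airport coordinates are illustrative):

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance via the haversine formula, the kind of
    round-earth geometry used to fly aircraft along great-circle
    routes (illustrative; not FACET's code)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# e.g. LAX (33.94, -118.41) to JFK (40.64, -73.78): roughly 3,970 km
d = great_circle_km(33.94, -118.41, 40.64, -73.78)
```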

  13. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided as-is and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.
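
    One simple way to quantify "bottom-up" historical evidence of the kind the paradigm integrates is a Beta posterior over an attribute's pass rate. This is an illustrative assumption, not the paper's formulation:

```python
from math import comb

def beta_cdf(x, a, b):
    """Regularized incomplete beta I_x(a, b) for positive integer
    a, b, via its binomial-sum identity."""
    n = a + b - 1
    return sum(comb(n, j) * x**j * (1 - x)**(n - j) for j in range(a, n + 1))

def acceptance_probability(successes, failures, threshold):
    """Posterior P(attribute pass rate > threshold) under a uniform
    Beta(1, 1) prior updated with historical pass/fail evidence
    (a hypothetical scheme for combining top-down requirements with
    bottom-up evidence)."""
    a, b = 1 + successes, 1 + failures
    return 1.0 - beta_cdf(threshold, a, b)

# e.g. 48 passes and 2 failures observed; require pass rate above 0.9
p = acceptance_probability(successes=48, failures=2, threshold=0.9)
```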

  14. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

    Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction Project (0704-0188) Washington DC 20503. 1. AGENCY USE ONLY...This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the...interoperability and enhanced communication. 15. NUMBER OF PAGES 271 14. SUBJECT TERMS Software Engineering, Computer Science, Management

  15. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples, resulting in huge amounts of data. These data need to be inspected for plausibility before data evaluation in order to detect putative sources of error, e.g. retention-time or mass-accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
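
    Mass accuracy, one of the basic quality parameters such a check inspects, is conventionally expressed in parts per million. A minimal sketch of a per-target tolerance check (the target names and m/z values are illustrative, not QCScreen's code):

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass accuracy in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def flag_targets(measurements, tolerance_ppm=5.0):
    # measurements: {target_name: (measured_mz, theoretical_mz)}
    # True means the target passes the mass-accuracy tolerance
    return {name: abs(ppm_error(m, t)) <= tolerance_ppm
            for name, (m, t) in measurements.items()}

status = flag_targets({
    "caffeine": (195.0882, 195.0877),   # ~2.6 ppm high -> pass
    "reserpine": (609.2900, 609.2812),  # ~14 ppm high  -> fail
})
```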

  16. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth.
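
    As an example of the kind of model such tools expose, a classic secondary model for the effect of temperature is the Ratkowsky square-root model, sqrt(mu_max) = b(T - Tmin). A sketch with illustrative parameter values, not taken from any of the 16 tools:

```python
def ratkowsky_growth_rate(temp_c, b=0.023, t_min=-2.0):
    """Ratkowsky 'square-root' secondary model:
    sqrt(mu_max) = b * (T - Tmin), so mu_max = (b * (T - Tmin))^2.
    Parameter values here are illustrative, not tied to a specific
    organism; below Tmin no growth is predicted."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

rate_fridge = ratkowsky_growth_rate(4.0)    # slow growth at 4 C
rate_ambient = ratkowsky_growth_rate(25.0)  # much faster at 25 C
```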

  17. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware: an external analog-to-digital converter (GW Instruments, Somerville, Massachusetts) connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard. First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  18. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  19. Evaluation of the DSN software methodology

    NASA Technical Reports Server (NTRS)

    Irvine, A. P.; Mckenzie, M.

    1978-01-01

    The effects of the DSN software methodology, as implemented under the DSN Programming System, on the DSN Mark 3 Data Subsystems Implementation Project (MDS) are described. The software methodology is found to provide a markedly increased visibility to management, and to produce software of greater reliability at a small decrease in implementation cost. It is also projected that additional savings will result during the maintenance phase. Documentation support is identified as an area that is receiving further attention.

  1. SIMPLE: A Prototype Software Fault-Injection Tool

    DTIC Science & Technology

    2002-12-01

    54 1. CSMA/CD Software Description In their network utilization study, Sadiku and Ilyas implemented software to simulate a local area network that...cs2.html]. 2002. 49. Sadiku, M. and Ilyas, M., Simulation of Local Area Networks, Boca Raton, Florida. CRC Press, 1994, pp. 112-133. 50. Hightower

  2. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    PubMed

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data are transferred into statistics software, such as SAS, R or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not recorded who changed which data, when, and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. However, migration from spreadsheet-based data collection to EDC systems is labor-intensive and time-consuming at present. Hence, the objectives of this research work were to develop a mapping model, to implement a converter between the IBM SPSS format and the CDISC ODM standard, and to evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements, and study-related ODM elements are not available in SPSS. The S2O converter was implemented as a command-line tool using the SPSS-internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and by reverse transformation from ODM into the SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format IBM SPSS and the ODM standard for definition and exchange of trial data is feasible
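
    The mapping idea (SPSS variable definitions becoming ODM item definitions) can be sketched as follows. The element subset and attribute choices here are heavily simplified assumptions, not the full CDISC ODM schema or S2O's implementation:

```python
import xml.etree.ElementTree as ET

def spss_variables_to_odm(variables):
    """Map a list of SPSS-style variable definitions to a heavily
    simplified, ODM-like ItemDef list (a sketch of the mapping idea
    only; real CDISC ODM has many more required elements)."""
    odm = ET.Element("ODM")
    study = ET.SubElement(odm, "Study", OID="S.1")
    meta = ET.SubElement(study, "MetaDataVersion", OID="MDV.1")
    for var in variables:
        ET.SubElement(meta, "ItemDef", OID=f"I.{var['name']}",
                      Name=var["label"], DataType=var["type"])
    return ET.tostring(odm, encoding="unicode")

xml_text = spss_variables_to_odm([
    {"name": "AGE", "label": "Age at enrolment", "type": "integer"},
    {"name": "SEX", "label": "Sex", "type": "text"},
])
```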

  3. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently entered clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  4. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    PubMed

    Ma, Tianle; Zhang, Aidong

    2016-02-26

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in single- or multi-omic settings, and the fast-evolving integrated software platforms, such as workflow management systems, that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels, from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies, and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.

  5. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which is itself a port of ISODE to the Cray. In addition, we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their feedback on the use of ELROS in implementing ISO protocols: whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system once we had completed our port of ELROS to the Cray.

  7. KidTools: Self-Management, Problem-Solving, Organizational, and Planning Software for Children and Teachers

    ERIC Educational Resources Information Center

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Katherine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  8. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    PubMed

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
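
    The Bayesian a posteriori adjustment performed by most of the benchmarked programs can be sketched as a maximum a posteriori (MAP) estimate of an individual pharmacokinetic parameter from a single measured level. The one-compartment model, priors, and numbers below are invented for illustration and are not taken from any of the reviewed tools:

```python
import math

# A minimal MAP sketch of Bayesian dose individualization: estimate an
# individual clearance from one measured concentration, balancing the fit
# to the observation against a log-normal population prior. All values
# (dose, volume, priors, errors) are illustrative assumptions.
DOSE_MG = 1000.0          # administered IV bolus dose
V_L = 30.0                # volume of distribution, assumed known here
CL_POP = 5.0              # population mean clearance (L/h)
OMEGA = 0.3               # between-subject SD of ln(CL)
SIGMA = 1.0               # residual error SD (mg/L)

def conc(cl, t):
    """Predicted concentration for a one-compartment IV bolus model."""
    return (DOSE_MG / V_L) * math.exp(-cl / V_L * t)

def map_clearance(t_obs, c_obs):
    """Grid-search MAP estimate of clearance given one measured level."""
    best_cl, best_obj = None, float("inf")
    cl = 0.5
    while cl < 20.0:
        resid = (c_obs - conc(cl, t_obs)) ** 2 / (2 * SIGMA ** 2)
        prior = (math.log(cl) - math.log(CL_POP)) ** 2 / (2 * OMEGA ** 2)
        if resid + prior < best_obj:
            best_cl, best_obj = cl, resid + prior
        cl += 0.01
    return best_cl

# A level of 10 mg/L at 8 h is higher than the population prediction,
# so the MAP estimate is pulled below the population clearance.
cl_hat = map_clearance(t_obs=8.0, c_obs=10.0)
```

    Real TDM software replaces the grid search with proper nonlinear optimization over all model parameters, but the shrinkage of the individual estimate toward the population prior is the same mechanism.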

  9. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal-computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  10. Technology Foundations for Computational Evaluation of Software Security Attributes

    DTIC Science & Technology

    2006-12-01

    Technology Foundations for Computational Evaluation of Software Security Attributes. Gwendolyn H. Walton, Thomas A. Longstaff, Richard C... security attributes to the functional behavior of the software. The emergence of CERT's new function extraction (FX) technology, unavailable to previous... software meets security requirements if they have been specified in behavioral terms. FX technology prescribes effective means to create and record

  11. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition, and it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). DTS application requires two reconstruction and two registration processes, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of producing two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are computed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from matching ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from matching them in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
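
    The principle behind limited-angle tomosynthesis can be illustrated with the classic shift-and-add scheme: each projection is translated by an amount proportional to the depth of the plane of interest, so features in that plane align and reinforce while features at other depths blur out. The paper's software uses GPU-accelerated reconstruction; the toy geometry and phantom below are invented purely to show why a chosen plane comes into focus:

```python
# Toy shift-and-add tomosynthesis: 1D detector rows, integer shifts.
# A feature at depth z appears in a view shifted by z * shift_per_depth.

def project(phantom_planes, shift_per_depth):
    """Sum of all depth planes, each translated according to its depth."""
    width = len(next(iter(phantom_planes.values())))
    proj = [0.0] * width
    for depth, plane in phantom_planes.items():
        s = depth * shift_per_depth
        for x in range(width):
            if 0 <= x - s < width:
                proj[x] += plane[x - s]
    return proj

def shift_and_add(projections, focus_depth):
    """Realign each projection for the focus plane, then average."""
    width = len(projections[0][1])
    recon = [0.0] * width
    for shift_per_depth, proj in projections:
        s = focus_depth * shift_per_depth
        for x in range(width):
            if 0 <= x + s < width:
                recon[x] += proj[x + s]
    return [v / len(projections) for v in recon]

# Two-plane phantom: a point at x=5 on depth 0, another at x=9 on depth 2.
phantom = {0: [0.0] * 16, 2: [0.0] * 16}
phantom[0][5] = 1.0
phantom[2][9] = 1.0
views = [(s, project(phantom, s)) for s in (-2, -1, 0, 1, 2)]
slice0 = shift_and_add(views, focus_depth=0)
```

    In `slice0` the depth-0 point keeps (slightly more than) its full intensity, while the depth-2 point is smeared into five weak copies, which is the focusing effect that makes DTS usable for positioning despite the narrow scan angle.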

  12. An Approach to Building a Traceability Tool for Software Development

    NASA Technical Reports Server (NTRS)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations about the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional

  13. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  15. Pvarray: A software tool for photovoltaic array design

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

    The applications of PVARRAY, a software program for the design of photovoltaic arrays, are described. Results of sample parametric studies on array configurations are presented. It is concluded that PVARRAY can simulate a variety of configurations.

  16. Air traffic management evaluation tool

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar (Inventor); Sheth, Kapil S. (Inventor); Chatterji, Gano Broto (Inventor); Bilimoria, Karl D. (Inventor); Grabbe, Shon (Inventor); Schipper, John F. (Inventor)

    2010-01-01

    Method and system for evaluating and implementing air traffic management tools and approaches for managing and avoiding an air traffic incident before the incident occurs. The invention provides flight plan routing and direct routing or wind-optimal routing, using great circle navigation and spherical Earth geometry. The invention accounts for aircraft dynamics effects, such as wind effects at each altitude, altitude changes, airspeed changes and aircraft turns, to provide predictions of aircraft trajectory (and, optionally, aircraft fuel use). A second system provides several aviation applications using the first system. These applications include conflict detection and resolution, miles-in-trail or minutes-in-trail aircraft separation, flight arrival management, flight re-routing, weather prediction and analysis, and interpolation of weather variables based upon sparse measurements.
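
    The trajectory predictions above rest on great-circle navigation over a spherical Earth. A minimal sketch of that computation, using the standard haversine form of the great-circle distance (the city coordinates and nautical-mile Earth radius are illustrative values, not taken from the patent):

```python
import math

def great_circle_distance_nmi(lat1, lon1, lat2, lon2, radius_nmi=3440.065):
    """Great-circle distance between two points on a spherical Earth,
    in nautical miles, via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_nmi * math.asin(math.sqrt(a))

# Example: a direct-routing distance from Los Angeles to New York,
# roughly 2150 nautical miles along the great circle.
d = great_circle_distance_nmi(33.94, -118.41, 40.64, -73.78)
```

    Wind-optimal routing then perturbs this baseline: the along-track ground speed at each segment is adjusted by the forecast wind at that altitude before the arrival time is predicted.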

  17. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation steps that make up the CS-PN software are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request, or couple (transaction, granule), treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.

  18. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will grant interoperability with other FOSS software and tools and, at the same time, keep BYMUR readily available to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be applied to the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and performing the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).

  19. Software-based evaluation of human attractiveness: a pilot study.

    PubMed

    Patzelt, Sebastian B M; Schaible, Leonie K; Stampf, Susanne; Kohal, Ralf J

    2014-11-01

    The difficulty of evaluating esthetics in an unbiased way may be overcome by using automated software applications. The purpose of this study was to assess the use of a smartphone application as an objective tool for evaluating attractiveness and to evaluate its potential in dentistry. Ten white participants (mean age ±SD, 42.1 ±22.6 years) were randomly chosen, and frontal facial pictures of each participant were taken. The smartphone application PhotoGenic was used to evaluate the attractiveness of the participants. For comparison, 100 randomly selected raters (age >16 years, from the social environment of the research team) were asked to evaluate the same participants. The influence of the participants' facial expression, age, and sex, as well as the raters' age, sex, and occupation, was investigated. Statistical analyses (linear mixed models with random intercepts; least square means, 95% confidence interval; P<.05) were implemented. PhotoGenic produced a mean ±SD attractiveness score of 6.4 ±1.2 and the rater group one of 4.9 ±1.8 (P<.001; score range, 0-10). Female raters tended to give slightly higher attractiveness scores. The participants' sex, facial expression, and age appeared to be of little relevance; however, the raters' sex and occupation had an impact on the evaluation. PhotoGenic rated the participants' attractiveness with higher scores (more attractive) than did the human raters. Currently, PhotoGenic is not suitable as an objective evaluation tool for dental treatment outcomes, because the visibility of the teeth (smiling facial expression) has no influence on its evaluation. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  20. Development of Automatic Testing Tool for `Design & Coding Standard' for Railway Signaling Software

    NASA Astrophysics Data System (ADS)

    Hwang, Jong-gyu; Jo, Hyun-jeong

    2009-08-01

    With recent advances in computer technology, railway signaling systems depend increasingly on software, and testing the safety and reliability of railway signaling software has accordingly become more important. This paper presents an automated testing tool for coding rules for railway signaling software and describes its implementation. The test items implemented in the tool are based on the international standards for railway system software and on the MISRA-C standard. This automated testing tool can be used at the assessment stage for railway signaling systems and is also expected to be useful during software development.
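
    The core of such a tool is a static scan of source lines against a rule set. The sketch below is a deliberately simplified stand-in: its two rules are inspired by common MISRA-C guidance (avoiding `goto`, avoiding octal constants) but are not the actual rule set of the tool described above, and a real checker would parse the code rather than pattern-match lines:

```python
import re

# Hypothetical rule table: (diagnostic message, pattern that flags it).
RULES = [
    ("banned 'goto' statement", re.compile(r"\bgoto\b")),
    ("octal constant", re.compile(r"\b0[0-7]+\b")),
]

def check_source(lines):
    """Return (line_number, message) pairs for every rule violation."""
    findings = []
    for lineno, line in enumerate(lines, start=1):
        code = line.split("//")[0]   # crude: ignore line comments
        for message, pattern in RULES:
            if pattern.search(code):
                findings.append((lineno, message))
    return findings

source = [
    "int state = 010;   /* octal literal */",
    "if (fail) goto done;",
    "done: return state;",
]
issues = check_source(source)   # flags line 1 (octal) and line 2 (goto)
```

    Mapping each diagnostic back to a line number is what makes the output usable both during development and at the later assessment stage.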

  1. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  2. OpenROCS: a software tool to control robotic observatories

    NASA Astrophysics Data System (ADS)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the processes necessary to respond to the system events arising in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provide high flexibility for adaptation to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).

  3. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    PubMed

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals whose urinary tracts are normal from both a structural and a functional perspective. Suggesting the appropriate antibiotics and treatment for individuals suffering from uUTI is an important and complex task that demands special attention. How to decrease the unsafe use of antibiotics and their consumption is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable and sound. The results show that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Due to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems.
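
    The FCM inference underlying such a tool is a simple iteration: each concept's activation is updated from the weighted activations of the concepts that influence it, squashed through a sigmoid, until the map settles. The concepts and causal weights below are invented placeholders, not the clinical uUTI knowledge base of the paper:

```python
import math

# Toy fuzzy cognitive map: two symptom concepts causally driving a
# treatment-decision concept. Weights are illustrative assumptions.
concepts = ["dysuria", "bacteriuria", "prescribe_antibiotic"]
weights = {                       # weights[(i, j)]: influence of i on j
    ("dysuria", "prescribe_antibiotic"): 0.6,
    ("bacteriuria", "prescribe_antibiotic"): 0.8,
}

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + math.exp(-lam * x))

def fcm_step(state):
    """One synchronous update: A_j(t+1) = f(A_j(t) + sum_i w_ij * A_i(t))."""
    new_state = {}
    for j in concepts:
        total = state[j]
        for i in concepts:
            total += weights.get((i, j), 0.0) * state[i]
        new_state[j] = sigmoid(total)
    return new_state

# Strong symptoms switched on; iterate until activations settle.
state = {"dysuria": 1.0, "bacteriuria": 1.0, "prescribe_antibiotic": 0.0}
for _ in range(10):
    state = fcm_step(state)
decision = state["prescribe_antibiotic"]   # settles near the high end
```

    Missing inputs are naturally tolerated: an unknown symptom simply starts at a neutral activation and contributes little, which is exactly the robustness to missing information claimed for the FCM approach.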

  4. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    PubMed

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
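
    The quantity being automated is the fraction of the image not occupied by cells, tracked over time as the wound closes. TScratch detects cell-covered regions with the fast discrete curvelet transform; in the sketch below a plain intensity threshold stands in for that detector, on an invented toy "image":

```python
# Toy open-wound quantification: count pixels darker than a threshold
# (cell-free scratch) as a fraction of the whole image. A stand-in for
# TScratch's curvelet-based cell detector, for illustration only.
def open_area_fraction(image, threshold):
    """Fraction of pixels below `threshold`, treated as cell-free wound."""
    empty = total = 0
    for row in image:
        for pixel in row:
            total += 1
            if pixel < threshold:
                empty += 1
    return empty / total

# Toy 4x6 image: bright cell monolayer with a dark central scratch.
image = [
    [200, 210, 20, 15, 205, 198],
    [190, 201, 18, 22, 210, 207],
    [195, 199, 25, 19, 208, 200],
    [205, 197, 21, 17, 199, 202],
]
frac_t0 = open_area_fraction(image, threshold=100)   # 8 of 24 pixels
```

    Comparing this fraction between time points gives the migration rate; the advantage of a texture-based detector over a raw threshold is robustness to uneven illumination across wells.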

  5. Development of the FITS tools package for multiple software environments

    NASA Technical Reports Server (NTRS)

    Pence, W. D.; Blackburn, J. K.

    1992-01-01

    The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.

  6. Klonos: A Similarity Analysis Based Tool for Software Porting

    SciTech Connect

    Hernandez, Oscar; Ding, Wei

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. It is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). The tool combines syntactic metrics with cost-model-provided metrics to cluster subroutines that are similar enough to be ported in the same way. It then generates a porting plan, which allows programmers and compilers to reuse porting experience as much as possible during the porting process.
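
    Similarity-based grouping of this kind can be sketched as follows: reduce each subroutine to a feature vector (e.g. counts of loops, calls, array references), compare vectors with cosine similarity, and cluster routines that exceed a threshold so one porting strategy covers the whole cluster. The feature set, names, and threshold here are invented examples, not Klonos's actual metrics:

```python
import math

# Hypothetical per-subroutine features: [loop_count, call_count, array_refs].
features = {
    "solve_x": [4, 2, 9],
    "solve_y": [4, 2, 8],
    "io_dump": [0, 7, 1],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def group_similar(features, threshold=0.95):
    """Greedy clustering: a routine joins the first cluster it resembles."""
    clusters = []
    for name, vec in features.items():
        for cluster in clusters:
            if cosine(vec, features[cluster[0]]) >= threshold:
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

clusters = group_similar(features)   # the two solvers group together
```

    A porting plan then orders the clusters so that once one representative routine is ported and validated, the rest of its cluster can follow the same recipe.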

  7. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada Source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  8. The Hydromorphological Evaluation Tool (HYMET)

    NASA Astrophysics Data System (ADS)

    Klösch, Mario; Habersack, Helmut

    2017-08-01

    River engineering structures, such as bank protection or bed sills, act as constraints on the hydromorphology of rivers and limit morphodynamic processes. Accordingly, the deviations of a river's morphology from a natural reference condition have been attributed to the degree of artificiality in the observed river section, and river restoration works have mainly aimed at reducing artificial constraints within the river reach. Less attention has been paid to alterations of the sediment continuum between sediment production in the river's catchment and downstream river reaches. However, the sediment supply from upstream is strongly reflected in morphodynamics such as bar formation or the reworking of the riverbed. Any alteration of sediment supply may affect the morphological appearance of a reach and determine its deviation from an undisturbed condition. We introduce the Hydromorphological Evaluation Tool (HYMET), which accounts in a hierarchical procedure for sediment supply and sediment transfer as catchment and river network based preconditions for sustainable morphodynamics in river reaches. At the reach scale, artificiality and the sediment budget are assessed. In contrast to existing evaluation methods for assessing hydromorphological state, no reference condition is needed for determining hydromorphological alterations. Here, with re-established sediment supply and reduced artificiality, a river reach is expected to develop morphodynamics that approach a morphodynamically and ecologically sustainable condition. Application to the Drau River showed that the alteration of sediment supply strongly affects the hydromorphological condition and thus the evaluation result of a restored reach, indicating the remaining potential for the re-initiation/re-establishment of morphodynamics through catchment-wide restoration plans.

  9. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    SciTech Connect

    Habib, Salman; Roser, Robert

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  10. Learning Content and Software Evaluation and Personalisation Problems

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Serikoviene, Silvija

    2010-01-01

    The paper analyses several scientific approaches to evaluating, implementing, or choosing learning content and software suited to personalised users'/learners' needs. A learning object metadata customisation method, as well as the method of multiple criteria evaluation and optimisation of learning software represented by the experts' additive…

  11. Computer Software for Teaching Basic Skills to Adults. An Evaluation.

    ERIC Educational Resources Information Center

    Montana State Univ., Bozeman. Center for Community Education.

    This color-coded guide/catalog was prepared as a resource for adult educators through a Montana project that evaluated computer software for teaching basic skills to adults. The guide is divided into three parts. Part I consists of the results of the assessment and evaluation of 119 pieces of software currently being used at 16 adult basic…

  12. A software tool to assist business-process decision-making in the biopharmaceutical industry.

    PubMed

    Mustafa, Mustafa A; Washbrook, John; Lim, Ai Chye; Zhou, Yuhong; Titchener-Hooker, Nigel J; Morton, Philip; Berezenko, Steve; Farid, Suzanne S

    2004-01-01

    Conventionally, software tools for the design of bioprocesses have provided only limited business-related information for decision-making. There is an industrial need to investigate manufacturing options and to gauge the impact of various decisions from economic as well as process perspectives. This paper describes the development and use of a tool to provide an assessment of whole flowsheets by capturing both process and business aspects. The tool is demonstrated by considering the issues involved in deciding between two potential flowsheets for a common product. A case study approach is used to compare the process and business benefits of a conventional process route employing packed chromatography beds and an alternative that uses expanded bed adsorption (EBA). The tool allows direct evaluation of the capital cost reduction and increased yield offered by EBA against the penalties of a potentially more expensive EBA matrix with a shorter lifetime. Furthermore, the tool provides the ability to gauge the process robustness of each flowsheet option.
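    A toy illustration of the kind of trade-off such a tool quantifies (all figures below are hypothetical, not from the paper): EBA lowers capital cost per batch and raises yield, but its matrix costs more and survives fewer cycles, so the comparison can be reduced to cost per gram of product.

```python
# Hypothetical numbers only: compare cost of goods per gram for a packed-bed
# route versus an EBA route, amortizing the matrix over its usable cycles.
def cost_per_gram(capital_per_batch, matrix_cost, matrix_cycles,
                  other_cost_per_batch, feed_grams, yield_fraction):
    matrix_per_batch = matrix_cost / matrix_cycles   # amortize matrix lifetime
    total = capital_per_batch + matrix_per_batch + other_cost_per_batch
    return total / (feed_grams * yield_fraction)

packed_bed = cost_per_gram(capital_per_batch=2000.0, matrix_cost=50000.0,
                           matrix_cycles=100, other_cost_per_batch=1500.0,
                           feed_grams=500.0, yield_fraction=0.70)
eba = cost_per_gram(capital_per_batch=1500.0, matrix_cost=90000.0,
                    matrix_cycles=50, other_cost_per_batch=1500.0,
                    feed_grams=500.0, yield_fraction=0.85)
```

    With these invented inputs the higher yield and lower capital cost of EBA outweigh the more expensive, shorter-lived matrix; different assumptions can flip the conclusion, which is exactly why a tool capturing both process and business aspects is useful.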

  13. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  14. Using Software Development Tools and Practices in Acquisition

    DTIC Science & Technology

    2013-12-01

    Teresa (Nolan, Norton and Co.), Proctor, Larry (Nolan, Norton and Co.), Cordelle, Denis (Cap Gemini Segoti), Ferotin, Jean-Eloi (Cap Gemini Segoti)...Solvay, Jean-Philippe (Cap Gemini Segoti), & Segoti, Jean-Philippe (Cap Gemini Segoti). Software Process Automation: Experiences from the Trenches

  15. Experiments in Chemistry: A Model Science Software Tool.

    ERIC Educational Resources Information Center

    Malone, Diana; Tinker, Robert

    1984-01-01

    Describes "Experiments in Chemistry," in which experiments are performed using software and hardware interfaced to the Apple microcomputer's game paddle port. Experiments include temperature, pH electrode, and EMF (cell potential determinations, oxidation-reduction titrations, and precipitation titrations) investigations. (JN)

  16. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    DTIC Science & Technology

    1991-04-01

    Industrial Reliability Program ABSTRACT: Program is utilized for analysis of industrial equipment for which military requirements are not applicable...Can communicate with other TECNASA software and has British or Portuguese menus. MACHINES: IBM PC POC: TECNASA Attn: Jose L. Barletta Electronica ...termed "performability". Models both repairable and nonrepairable systems. MACHINES: No Data POC: Industrial Technology Institute 20 NAME: METFAC

  19. A Study of Collaborative Software Development Using Groupware Tools

    ERIC Educational Resources Information Center

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  20. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly detection properties that are embedded or hard-coded in the source code are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property, or a modification to an existing property, is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that would otherwise have limited availability to the scientific community.
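    The core idea above — properties specified as data rather than hard-coded in the verification system — can be sketched as follows. The sensor names, fields, and thresholds are illustrative assumptions, not the poster's actual tool:

```python
# Anomaly-detection properties expressed as data: scientists can add or
# refine entries in this list without modifying the detection code itself.
properties = [
    # (sensor, property description, check applied to a single reading)
    ("air_temp_c", "physical range at this field site", lambda v: -40.0 <= v <= 55.0),
    ("rel_humidity", "sensor reports a percentage", lambda v: 0.0 <= v <= 100.0),
]

def detect_anomalies(stream):
    """Apply every specified property to each (sensor, value) reading."""
    alerts = []
    for i, (sensor, value) in enumerate(stream):
        for prop_sensor, description, check in properties:
            if sensor == prop_sensor and not check(value):
                alerts.append((i, sensor, value, description))
    return alerts

# toy readings: the second and third violate the properties above
readings = [("air_temp_c", 21.4), ("rel_humidity", 101.2), ("air_temp_c", -55.0)]
alerts = detect_anomalies(readings)
```

    Because `detect_anomalies` is independent of where the readings come from, the same property list can be reused across field sites or instruments, mirroring the knowledge-sharing goal described above.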

  1. Software Validation, Verification, and Testing Technique and Tool Reference Guide. Final Report.

    ERIC Educational Resources Information Center

    Powell, Patricia B., Ed.

    Intended as an aid in the selection of software techniques and tools, this document contains three sections: (1) a suggested methodology for the selection of validation, verification, and testing (VVT) techniques and tools; (2) summary matrices by development phase usage, a table of techniques and tools with associated keywords, and an…

  2. Orbit Analysis Tools Software (Version 1.0) User’s Manual

    DTIC Science & Technology

    1993-04-15

    Naval Research Laboratory, Washington, DC 20375-5320. AD-A265 012; NRL/MR/8103--93-7307. Orbit Analysis Tools Software (Version 1.0) User's Manual, April 13, 1993. Author: Alan S. Hope. A program to perform satellite mission and coverage analysis has been written. The Orbit Analysis Tools Software (OATS) program uses

  3. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Several research tasks were previously conducted, observations obtained, and possible suggestions contemplated concerning software quality assurance engineering at NASA Johnson. These research tasks are briefly described. A brief discussion is also given of the role of software quality assurance in software engineering, along with observations and suggestions. A brief discussion of a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model that can be used for searching for and collecting software quality assurance tools is presented.

  4. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support systems operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.
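    The hybrid scheme described above — continuous variables integrated in small time steps, punctuated by discrete reconfiguration events — can be sketched generically. The tank model, rates, and thresholds below are invented for illustration and are not CONFIG itself:

```python
# Minimal hybrid discrete/continuous simulation: pressure in a tank evolves
# continuously (Euler integration), while a relief valve opening/closing is
# a discrete event that reconfigures the model mid-run.
def simulate(t_end, dt=0.01, inflow=2.0, outflow=5.0, relief_at=10.0):
    pressure, valve_open = 0.0, False
    events = []
    steps = round(t_end / dt)
    for k in range(steps):
        # continuous part: Euler step of dP/dt with the current configuration
        rate = inflow - (outflow if valve_open else 0.0)
        pressure += rate * dt
        # discrete part: events fire when thresholds are crossed
        if not valve_open and pressure >= relief_at:
            valve_open = True
            events.append(("valve_open", round(k * dt, 2)))
        elif valve_open and pressure <= 0.0:
            valve_open = False
            pressure = 0.0
            events.append(("valve_closed", round(k * dt, 2)))
    return pressure, events

final_pressure, events = simulate(t_end=20.0)
```

    In a tool like CONFIG the "discrete part" would also include commands from the intelligent control software under evaluation, not just internal threshold crossings.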

  5. Computer Aided Learning of Mathematics: Software Evaluation

    ERIC Educational Resources Information Center

    Yushau, B.; Bokhari, M. A.; Wessels, D. C. J.

    2004-01-01

    Computer Aided Learning of Mathematics (CALM) has been in use for some time in the Prep-Year Mathematics Program at King Fahd University of Petroleum & Minerals. Different kinds of software (both locally designed and imported) have been used in the quest of optimizing the recitation/problem session hour of the mathematics classes. This paper…

  7. Methods, Software and Tools for Three Numerical Applications. Final report

    SciTech Connect

    E. R. Jessup

    2000-03-01

    This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).
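    The linear-algebra core of LSI mentioned in item (3) can be sketched with a toy example: power iteration recovers the dominant singular direction of a term-document matrix, so terms that co-occur across documents load onto the same latent "topic" axis. The vocabulary and counts below are invented for illustration:

```python
# Rank-1 LSI sketch: power iteration on A A^T converges to the dominant
# left singular vector of the term-document matrix A, grouping co-occurring
# terms onto one latent dimension. Pure Python, toy data.
import math

terms = ["orbit", "satellite", "protein", "assay"]
A = [                       # rows = terms, columns = documents
    [1, 1, 1, 0, 0],        # "orbit" appears in docs 0-2
    [1, 1, 0, 0, 0],        # "satellite" co-occurs with "orbit"
    [0, 0, 0, 1, 1],        # "protein" appears in docs 3-4
    [0, 0, 0, 1, 1],        # "assay" co-occurs with "protein"
]

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def top_left_singular_vector(A, iters=100):
    """Power iteration: u <- A A^T u, renormalized each step."""
    m, n = len(A), len(A[0])
    At = [[A[i][j] for i in range(m)] for j in range(n)]  # transpose of A
    u = [1.0] * m
    for _ in range(iters):
        u = matvec(A, matvec(At, u))
        norm = math.sqrt(sum(x * x for x in u))
        u = [x / norm for x in u]
    return u

loadings = dict(zip(terms, top_left_singular_vector(A)))
```

    The dominant axis here is the "space" topic (it has slightly more document mass), so `orbit` and `satellite` receive large same-sign loadings while `protein` and `assay` decay toward zero; a full LSI implementation keeps several singular directions rather than one.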

  8. STADIUM FLIR: a software tool for FLIR92 and ACQUIRE

    NASA Astrophysics Data System (ADS)

    Hess, Glenn T.; Sanders, Thomas J.

    2000-07-01

    FLIR92 and ACQUIRE have become the standard simulation models used in virtually all Forward Looking Infrared (FLIR) system design. Recently, a software program called STADIUM FLIR has been written for use with the U.S. Army's FLIR92 and ACQUIRE models. This software provides many performance and ease-of-use enhancements for the models. Some of these enhancements include graphical user interfaces for all model parameter entry, data extraction between FLIR92 and ACQUIRE, and comprehensive plotting of output curves. All data extraction and plotting is automatic and seamless. STADIUM FLIR is based on AET's STADIUM technology, which adds powerful Design of Experiments and statistical analysis capabilities to simulation environments. The results are presented both quantitatively and graphically. STADIUM FLIR provides comprehensive plotting capabilities for both raw data and "overlaid" statistical variability data. STADIUM FLIR provides the power to perform multiple FLIR92 and ACQUIRE simulations with inputs (even multiple targets) varying over user-specified ranges. This paper will describe the software and how it enhances the power of FLIR92 and ACQUIRE.

  9. Evaluating software development characteristics: Assessment of software measures in the Software Engineering Laboratory. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1981-01-01

    Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics are discussed in terms of criteria achievement, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.

  10. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    EPA Science Inventory

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  12. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    SciTech Connect

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  13. A Software Tool for Processing the Displacement Time Series Extracted from Raw Radar Data

    NASA Astrophysics Data System (ADS)

    Coppi, Francesco; Gentile, Carmelo; Paolo Ricci, Pier

    2010-05-01

    The application of high-resolution radar waveforms and interferometric principles recently led to the development of a microwave interferometer suitable for simultaneously measuring the (static or dynamic) deflection of several points on a large structure. From the technical standpoint, the sensor is a Stepped-Frequency Continuous Wave (SF-CW) coherent radar operating in the Ku frequency band. The paper addresses the main procedures adopted to extract the deflection time series from raw radar data and to assess data quality, and describes the MATLAB toolbox developed. Subsequently, other functions implemented in the software tool (e.g. evaluation of the spectral matrix of the deflection time histories, identification of natural frequencies, and evaluation of operational mode shapes) are described, and the application to data recorded on full-scale bridges is exemplified.
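    One step such a toolbox performs is estimating a structure's natural frequency from a deflection time series by locating the dominant spectral peak. This is a generic sketch of that step, not the paper's MATLAB code, and the 2 Hz signal is synthetic:

```python
# Estimate a natural frequency from a (synthetic) deflection time series by
# picking the dominant peak of its discrete Fourier transform.
import cmath
import math

fs = 32.0                      # sampling rate, Hz
n = 256
f0 = 2.0                       # true frequency of the toy deflection signal
x = [math.sin(2 * math.pi * f0 * k / fs) for k in range(n)]

def dft_magnitudes(x):
    """Magnitude of the DFT for the first n/2 bins (positive frequencies)."""
    n = len(x)
    return [abs(sum(x[k] * cmath.exp(-2j * math.pi * i * k / n)
                    for k in range(n))) for i in range(n // 2)]

mags = dft_magnitudes(x)
peak_bin = max(range(1, len(mags)), key=lambda i: mags[i])  # skip DC bin
natural_freq = peak_bin * fs / n   # bin index -> frequency in Hz
```

    A production tool would use an FFT and averaged spectra (and, for mode shapes, the full cross-spectral matrix across measurement points), but the peak-picking principle is the same.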

  14. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
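    A representative texture attribute of the kind such software computes is the gray-level co-occurrence matrix (GLCM) and features derived from it. The following is an illustrative sketch on a toy 4x4 image, not MaZda code:

```python
# GLCM for horizontally adjacent pixels, plus the classic "contrast" feature.
levels = 4

image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]

def glcm_horizontal(img, levels):
    """Count pairs (img[r][c], img[r][c+1]) of horizontally adjacent pixels."""
    glcm = [[0] * levels for _ in range(levels)]
    for row in img:
        for a, b in zip(row, row[1:]):
            glcm[a][b] += 1
    return glcm

def contrast(glcm):
    """Sum of (i-j)^2 weighted by normalized co-occurrence probability."""
    total = sum(map(sum, glcm))
    return sum(((i - j) ** 2) * glcm[i][j] / total
               for i in range(len(glcm)) for j in range(len(glcm)))

g = glcm_horizontal(image, levels)
c = contrast(g)
```

    Tools like MaZda compute many such attributes (over several offsets and directions, in 2D and 3D) and then select the most discriminative ones for tissue classification.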

  15. Evaluation of selected environmental decision support software

    SciTech Connect

    Sullivan, T.M.; Moskowitz, P.D.; Gitten, M.

    1997-06-01

    Decision Support Software (DSS) continues to be developed to support analysis of decisions pertaining to environmental management. Decision support systems are computer-based systems that facilitate the use of data, models, and structured decision processes in decision making. The optimal DSS should integrate, analyze, and present environmental information to remediation project managers in order to select cost-effective cleanup strategies. The optimal system should balance the sophistication needed to address the wide range of complicated sites and site conditions present at DOE facilities against ease of use (e.g., the system should not require data that are typically unknown, and it should robustly check the problem definition and inputs for errors). In the first phase of this study, an extensive review of the literature and the Internet, and discussions with sponsors and developers of DSS, led to the identification of approximately fifty software packages that met the preceding definition.

  16. A Process for COTS Software Product Evaluation

    DTIC Science & Technology

    2004-07-01

    gathered to support the measurement method selected, then the criterion is not good. For example, “quality of engineering” is not a good criterion...[Ncube 99] Ncube, C. & Maiden, N. A. M. “PORE: Procurement Oriented Requirements Engineering Method for the Component-Based Systems Engineering”...Christos Scondras, Chief of Programs, XPK. This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a federally

  17. Evaluation of competing software reliability predictions

    NASA Technical Reports Server (NTRS)

    Abdel-Ghaly, A. A.; Chan, P. Y.; Littlewood, B.

    1986-01-01

    Different software reliability models can produce very different answers when called upon to predict future reliability in a reliability growth context. Users need to know which, if any, of the competing predictions are trustworthy. Some techniques are presented which form the basis of a partial solution to this problem. Rather than attempting to decide which model is generally best, the approach adopted here allows a user to decide upon the most appropriate model for each application.

  18. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  19. Slower Algebra Students Meet Faster Tools: Solving Algebra Word Problems with Graphing Software

    ERIC Educational Resources Information Center

    Yerushalmy, Michal

    2006-01-01

    The article discusses the ways that less successful mathematics students used graphing software with capabilities similar to a basic graphing calculator to solve algebra problems in context. The study is based on interviewing students who learned algebra for 3 years in an environment where software tools were always present. We found differences…

  20. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  1. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  3. Air traffic management evaluation tool

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar (Inventor); Sheth, Kapil S. (Inventor); Chatterji, Gano Broto (Inventor); Bilimoria, Karl D. (Inventor); Grabbe, Shon (Inventor); Schipper, John F. (Inventor)

    2012-01-01

    Methods for evaluating and implementing air traffic management tools and approaches for managing and avoiding an air traffic incident before the incident occurs. A first system receives parameters for flight plan configurations (e.g., initial fuel carried, flight route, flight route segments followed, flight altitude for a given flight route segment, aircraft velocity for each flight route segment, flight route ascent rate, flight route descent rate, flight departure site, flight departure time, flight arrival time, flight destination site and/or alternate flight destination site), flight plan schedule, expected weather along each flight route segment, aircraft specifics, airspace (altitude) bounds for each flight route segment, and navigational aids available. The invention provides flight plan routing and direct routing or wind-optimal routing, using great circle navigation and spherical Earth geometry. The invention provides for aircraft dynamics effects, such as wind effects at each altitude, altitude changes, airspeed changes and aircraft turns, to provide predictions of aircraft trajectory (and, optionally, aircraft fuel use). A second system provides several aviation applications using the first system. Several classes of potential incidents are analyzed and averted by appropriate change en route of one or more parameters in the flight plan configuration, as provided by a conflict detection and resolution module and/or traffic flow management modules. These applications include conflict detection and resolution, miles-in-trail or minutes-in-trail aircraft separation, flight arrival management, flight re-routing, weather prediction and analysis, and interpolation of weather variables based upon sparse measurements. The invention combines these features to provide an aircraft monitoring system and an aircraft user system that interact and negotiate changes with each other.
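    The great-circle navigation on spherical Earth geometry mentioned above rests on standard building blocks such as the haversine formula for great-circle distance between two waypoints. This is a generic sketch of that formula, not the patented system:

```python
# Great-circle (haversine) distance between two waypoints on a spherical
# Earth; the mean-radius value is the usual spherical-Earth assumption.
import math

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))
```

    For example, the distance from the equator to the pole along a meridian comes out to one quarter of the Earth's circumference (about 10,007 km).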

  4. Managing clinical research data: software tools for hypothesis exploration.

    PubMed

    Starmer, C F; Dietz, M A

    1990-07-01

    Data representation, data file specification, and the communication of data between software systems are playing increasingly important roles in clinical data management. This paper describes the concept of a self-documenting file that contains annotations, or comments, that aid visual inspection of the data file. We describe access to data from annotated files and illustrate data analysis with a few examples derived from the UNIX operating environment. Use of annotated files provides the investigator with both a useful representation of the primary data and a repository of comments that describe some of the context surrounding data capture.
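    A minimal sketch of the self-documenting-file idea: comment lines carry context about data capture, data lines carry the measurements, and a single reader yields both. The file format shown (`#` for annotations) is an assumption for illustration, not taken from the paper:

```python
# Read a small annotated data file: annotations are kept alongside the
# parsed records so the capture context travels with the data.
import io

raw = """\
# study: exercise stress protocol
# captured: 1990-07-01, lab 2
time_s,heart_rate
0,72
30,95
60,121
"""

def read_annotated(f):
    comments, rows = [], []
    for line in f:
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):
            comments.append(line[1:].strip())   # annotation: keep for context
        else:
            rows.append(line.split(","))        # data record
    header, data = rows[0], rows[1:]
    return comments, header, data

comments, header, data = read_annotated(io.StringIO(raw))
```

    In the UNIX spirit of the paper, the same convention lets command-line filters skip annotations (e.g. `grep -v '^#'`) while humans inspecting the file still see them.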

  5. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  6. Evaluation of clinical information modeling tools.

    PubMed

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified that are developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization. Clinical information models are specifications for representing the structure and semantic characteristics of clinical content in electronic health record systems. This research defines, tests and validates

  7. Software Tools | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine.  Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  8. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  9. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  10. Software Development Information Supported by Typical CASE Tools

    DTIC Science & Technology

    1991-03-01

    The work described in this report was accomplished as part of the Distributed Computing Design System (DCDS) evaluation project, AIRMICS Report ASQB...CASE environment. The DCDS evaluation technical report consists of five separate but related reports which evaluate the Distributed Computing Design

  11. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper surveys the current ISS attitude planning process and its associated requirements, goals, documentation, and software tools, and examines how a software tool could simplify and automate many of the planning actions that occur at the ADCO console. The project is covered from inception through the initial prototype delivery in November 2011, including development of design requirements and software as well as design verification and testing.

  12. A software tool for stitching two PET/CT body segments into a single whole-body image set.

    PubMed

    Chang, Tingting; Chang, Guoping; Clark, John W; Rohren, Eric M; Mawlawi, Osama R

    2012-05-10

    A whole-body PET/CT scan extending from the vertex of the head to the toes of the patient is not feasible on a number of commercially available PET/CT scanners due to a limitation in the extent of bed travel on these systems. In such cases, the PET scan has to be divided into two parts: one covering the upper body segment and the other the lower body segment. The aim of this paper is to describe and evaluate, using phantom and patient studies, a software tool that was developed to stitch the two body segments and output a single whole-body image set, thereby facilitating the interpretation of whole-body PET scans. A mathematical model was first developed to stitch images from two body segments using three landmarks. The model calculates the relative positions of the landmarks on the two segments and then generates a rigid transformation that aligns these landmarks. A software tool was written to implement this model while correcting for radioactive decay between the two body segments, and to output a single DICOM whole-body image set with all the necessary tags. One phantom and six patient studies were conducted to evaluate the performance of the software. In these studies, six radio-opaque markers (BBs) were used as landmarks (three on each leg). All studies were acquired in two body segments with BBs placed in the overlap region of the two segments. The PET/CT images of each segment were then stitched using the software tool to create a single DICOM whole-body PET/CT image. Evaluation of the stitching tool was based on visual inspection, consistency of radiotracer uptake in the two segments, and the ability to display the resultant DICOM image set on two independent workstations. The software tool successfully stitched the two segments of the phantom image and generated a single whole-body DICOM PET/CT image set with the correct alignment and activity concentration throughout the image.
The stitched images were viewed by two independent
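
    The landmark-based stitching described in this abstract — computing a rigid transformation that aligns three markers visible in both segments, then correcting the later segment for radioactive decay — can be sketched as follows. This is an illustrative reconstruction (the standard Kabsch algorithm plus a half-life correction), not the authors' code; the F-18 half-life default is our assumption.

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch algorithm: find rotation R and translation t minimizing
    the distance between R @ src_i + t and dst_i over the landmarks."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)           # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

def decay_correct(activity, minutes_later, half_life_min=109.77):
    """Scale activity measured in the later segment back to the first
    segment's reference time (109.77 min = F-18 half-life, our assumption)."""
    return activity * 2.0 ** (minutes_later / half_life_min)
```

    With the three BB landmarks of each segment as `src` and `dst`, applying the returned `R`, `t` to the second segment's coordinates brings the two image sets into a single frame before they are written out as one DICOM series.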

  13. Evaluation of air pollution modelling tools as environmental engineering courseware.

    PubMed

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually relies on mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and where a 3D context makes alphanumeric results hard to analyze. In this work, three different software tools, computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to run on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate their ability to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.

  14. Program Instrumentation: A Technique for Evaluating Educational Software.

    ERIC Educational Resources Information Center

    Bergeron, Bryan P.

    1990-01-01

    Discussion of educational software evaluation highlights an evaluation based on program instrumentation of a medical school simulation program. Evaluation strategies are discussed; evaluation techniques, including interviews, questionnaires, and observation are described; and implications of the results of the program instrumentation for the…

  15. lipID--a software tool for automated assignment of lipids in mass spectra.

    PubMed

    Hübner, Göran; Crone, Catharina; Lindner, Buko

    2009-12-01

    A new software tool called lipID is reported, which supports the identification of glycerophospholipids, glycosphingolipids, fatty acids and small oligosaccharides in mass spectra. The user-extendable software is a Microsoft (MS) Excel Add-In developed using Visual Basic for Applications and is compatible with all versions of MS Excel since MS Excel 97. It processes individual mass-to-charge values as well as mass lists, considering a number of user-defined options. The software's mode of operation, usage and options are explained, and the benefits and limitations of the tool are illustrated by means of three typical examples of lipid analyses.

  16. The Web Interface Template System (WITS), a software developer's tool

    SciTech Connect

    Lauer, L.J.; Lynam, M.; Muniz, T.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  17. An internet-based software tool for submitting crime information to forensic laboratories

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Rashpal S.; Govindarajulu, Sriram

    2004-11-01

    This paper describes an internet-based software tool developed for the West Virginia State Police Forensics Laboratory. The software enables law enforcement agents to submit crime information to the forensic laboratory via a secure Internet connection. Online electronic forms were created to mirror the existing paper-based forms, making the transition easier. The process of submitting case information was standardized and streamlined, thereby minimizing information inconsistency. Once gathered, the crime information is automatically stored in a database and can be viewed and queried by any authorized law enforcement officer. The software tool will be deployed in all counties of WV.

  18. Software tools for developing an acoustics multimedia CD-ROM

    NASA Astrophysics Data System (ADS)

    Bigelow, Todd W.; Wheeler, Paul A.

    2003-10-01

    A multimedia CD-ROM was developed to accompany the textbook, Science of Sound, by Tom Rossing. This paper discusses the multimedia elements included in the CD-ROM and the various software packages used to create them. PowerPoint presentations with an audio-track background were converted to web pages using Impatica. Animations of acoustic examples and quizzes were developed using Flash by Macromedia. Vegas Video and Sound Forge by Sonic Foundry were used for editing video and audio clips while Cleaner by Discreet was used to compress the clips for use over the internet. Math tutorials were presented as whiteboard presentations using Hitachi's Starboard to create the graphics and TechSmith's Camtasia Studio to record the presentations. The CD-ROM is in a web-page format created with Macromedia's Dreamweaver. All of these elements are integrated into a single course supplement that can be viewed by any computer with a web browser.

  19. Hard- and Software Tools for the Education of Geodetic VLBI

    NASA Astrophysics Data System (ADS)

    Hobiger, Thomas; Haas, Rüdiger; Varenius, Eskil

    2016-12-01

    The Onsala Space Observatory hosts two 2.3-m radio telescopes called SALSA (``Such a lovely small antenna'') which are used to bring front-line interactive astronomy to the classroom. Until now, SALSA has been used solely for educational purposes in astronomy, in particular to demonstrate the concept of single-dish measurements. However, it is possible to combine the two SALSAs into an interferometer by making use of hardware developed for software-defined radio. In doing so, one can use the SALSA antenna pair as a student demonstrator for geodetic Very Long Baseline Interferometry. We discuss which commercial off-the-shelf (COTS) hardware components are necessary to turn the SALSA installation into an interferometer. A simple Octave-based correlator has been written to process SALSA data. Results from a test run during which the Sun was tracked are presented and discussed.
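
    The heart of such a two-element correlator — cross-correlating the sampled voltage streams from the two antennas to locate the correlation peak, i.e. the geometric delay — can be sketched in a few lines. This is a simplified FX-style sketch in Python, not the Octave correlator used at Onsala:

```python
import numpy as np

def cross_correlate(x, y):
    """FX correlator for one baseline: multiply the conjugated spectrum of
    stream x by the spectrum of stream y, inverse-transform to the lag domain."""
    X, Y = np.fft.fft(x), np.fft.fft(y)
    lags = np.fft.ifft(np.conj(X) * Y)
    return np.fft.fftshift(lags.real)   # lag 0 moves to the array centre

def find_delay(x, y):
    """Lag (in samples) of the correlation peak; positive if y lags x."""
    lags = cross_correlate(x, y)
    return int(np.argmax(lags)) - len(x) // 2
```

    Tracking a strong source such as the Sun, the residual delay and the fringe phase across the band are the observables a geodetic analysis would then fit.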

  20. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc Flash Protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.
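
    As a hedged illustration of the kind of formula such programs incorporate, the Ralph Lee method referenced in NFPA 70E computes the flash protection boundary from the bolted-fault MVA and the clearing time. The helper below is a textbook sketch under that assumption, not a reproduction of any commercial package (which typically use the more detailed IEEE 1584 equations):

```python
import math

def lee_flash_boundary_ft(voltage_kv, bolted_fault_ka, clearing_time_s):
    """Flash protection boundary in feet by the Ralph Lee method:
    D = sqrt(2.65 * MVA_bf * t), with MVA_bf the three-phase bolted-fault MVA."""
    mva_bf = math.sqrt(3.0) * voltage_kv * bolted_fault_ka  # kV * kA = MVA
    return math.sqrt(2.65 * mva_bf * clearing_time_s)
```

    For a 480 V bus with a 20 kA bolted fault cleared in 0.1 s this gives a boundary of roughly 2.1 ft; the clearing time itself comes from the time-current coordination curves the abstract mentions.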

  1. Multiscale Software Tool for Controls Prototyping in Supersonic Combustors

    DTIC Science & Technology

    2004-04-01

    such systems for prototyping and design optimization becomes a formidable task. Present-day computational fluid dynamics (CFD) tools have found...are the activation function and the synaptic weights. The activation function is typically a sigmoid, or for a greater dynamic range, a hyperbolic...built-in capability to adapt their synaptic weights to changes in the surrounding environment. A neural network trained in a specific environment can

  2. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  3. Biogem: an effective tool-based approach for scaling up open source software development in bioinformatics.

    PubMed

    Bonnal, Raoul J P; Aerts, Jan; Githinji, George; Goto, Naohisa; MacLean, Dan; Miller, Chase A; Mishima, Hiroyuki; Pagani, Massimiliano; Ramirez-Gonzalez, Ricardo; Smant, Geert; Strozzi, Francesco; Syme, Rob; Vos, Rutger; Wennblom, Trevor J; Woodcroft, Ben J; Katayama, Toshiaki; Prins, Pjotr

    2012-04-01

    Biogem provides a software development environment for the Ruby programming language, which encourages community-based software development for bioinformatics while lowering the barrier to entry and encouraging best practices. Biogem, with its targeted modular and decentralized approach, software generator, tools and tight web integration, is an improved general model for scaling up collaborative open source software development in bioinformatics. Biogem and modules are free and are OSS. Biogem runs on all systems that support recent versions of Ruby, including Linux, Mac OS X and Windows. Further information is available at http://www.biogems.info; a tutorial is available at http://www.biogems.info/howto.html. Contact: bonnal@ingm.org.

  4. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    PubMed

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    The quantitative measurement of Atrioventricular Junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can conduct AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built with the Insight Segmentation and Registration Toolkit (ITK), The Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From the software engineering practice, it is concluded that ITK, VTK and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as quantitative measurement of motion by visual tracking.

  5. Introduction of software tools for epidemiological surveillance in infection control in Colombia.

    PubMed

    Hernández-Gómez, Cristhian; Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. The handmade method of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, while electronic surveillance can improve the performance, quality and traceability of recorded information. To assess the implementation of a strategy for electronic surveillance of HAI, Bacterial Resistance and Antimicrobial Consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia, during the period 2012-2013. An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the HAI Solutions software and the adherence to recording the required information. Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard CDC criteria, and 87% of institutions adhered to the case-identification module of the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. The introduction of ES could achieve greater adherence to an active, standardized and prospective surveillance model, helping to improve the validity and quality of the recorded information.

  6. Introduction of software tools for epidemiological surveillance in infection control in Colombia

    PubMed Central

    Motoa, Gabriel; Vallejo, Marta; Blanco, Víctor M; Correa, Adriana; de la Cadena, Elsa; Villegas, María Virginia

    2015-01-01

    Introduction: Healthcare-Associated Infections (HAI) are a challenge for patient safety in hospitals. Infection control committees (ICC) should follow CDC definitions when monitoring HAI. The handmade method of epidemiological surveillance (ES) may affect the sensitivity and specificity of the monitoring system, while electronic surveillance can improve the performance, quality and traceability of recorded information. Objective: To assess the implementation of a strategy for electronic surveillance of HAI, Bacterial Resistance and Antimicrobial Consumption by the ICC of 23 high-complexity clinics and hospitals in Colombia, during the period 2012-2013. Methods: An observational study evaluating the introduction of electronic tools in the ICC was performed; we evaluated the structure and operation of the ICC, the degree of incorporation of the HAI Solutions software and the adherence to recording the required information. Results: Thirty-eight percent of hospitals (8/23) had active surveillance strategies with standard CDC criteria, and 87% of institutions adhered to the case-identification module of the HAI Solutions software. In contrast, compliance with recording the risk factors for device-associated HAIs was 33%. Conclusions: The introduction of ES could achieve greater adherence to an active, standardized and prospective surveillance model, helping to improve the validity and quality of the recorded information. PMID:26309340

  7. A Checklist for Evaluating Content-Based Hypertext Computer Software.

    ERIC Educational Resources Information Center

    Tolhurst, Denise

    1992-01-01

    Discusses issues involved in the evaluation of hypertext computer software, including implementation considerations, documentation and packaging, control of instruction by learners, navigation aids for readers, linking mechanisms in hypertext applications, and curriculum development and classroom management considerations. A checklist summarizing…

  8. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  9. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  10. A Tool-Bearing Host in the National Software Works

    DTIC Science & Technology

    1977-04-01

    ...ARPA Order 2543/3. Controlling office: Advanced Research Projects Agency, 1400 Wilson Blvd...the installation of batch tools at CCN. Section 4 describes research relating to the design of the file-movement mechanism for NSW, the File Package

  11. Evaluating a Multimedia Authoring Tool.

    ERIC Educational Resources Information Center

    John, Bonnie E.; Mashyna, Matthew M.

    1997-01-01

    Presents a case study of a computer scientist learning and using the Cognitive Walkthrough (CW) technique to assess a multimedia authoring tool. Compares predictions by the analysis to the usability problems found in empirical usability tests. Presents several hypotheses about the cause of low effectiveness, which suggest that additional…

  12. Medical image database for software and algorithm evaluation

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Furuie, Sergio S.

    2005-04-01

    This work presents the development of a framework to make available a free, online, multipurpose and multimodality medical image database for software and algorithm evaluation. We have implemented a distributed architecture for a medical image database, including authoring, storage, and a repository for documents and image processing software. The system aims to offer a complete test bed and a set of resources including software, links to scientific papers, gold standards, reference images and post-processed images, enabling the medical image processing community (scientists, physicians, students and industry) to be more aware of evaluation issues. Our development focus was on the convenience and ease of use of a generic system adaptable to different contexts.

  13. A software metric for the evaluation of testing efficiency

    NASA Astrophysics Data System (ADS)

    Petkov, Alexander

    2016-12-01

    This article introduces and examines a metric for evaluating the efficiency of the software testing process. The characteristics of the testing process are examined, and its main activities and expected results are defined. A software metric for evaluating testing efficiency is then defined, based on the number of bugs discovered during testing versus the number of bugs discovered by clients. The metric was applied to four projects of a single organization, and the results were examined for correlation with the information needs the metric is intended to serve.
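
    A common formulation of such a metric is defect detection efficiency — the share of all recorded defects that testing caught before clients found them. The function below is an illustrative version, not necessarily the exact metric defined in the article:

```python
def defect_detection_efficiency(bugs_in_testing, bugs_from_clients):
    """Fraction of all recorded defects that the testing process caught
    before release; higher is better."""
    total = bugs_in_testing + bugs_from_clients
    if total == 0:
        raise ValueError("no defects recorded for this project")
    return bugs_in_testing / total
```

    A project where testing found 80 bugs and clients later reported 20 would score 0.8 on this measure.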

  14. A diagnostic tool for malaria based on computer software.

    PubMed

    Kotepui, Manas; Uthaisar, Kwuntida; Phunphuech, Bhukdee; Phiwklam, Nuoil

    2015-11-12

    Nowadays, the gold standard method for malaria diagnosis is the staining of thick and thin blood films examined by expert laboratorists. This requires well-trained laboratorists, is time-consuming, and is not automated. For this study, Maladiag Software was developed to predict malaria infection in suspected malaria patients. The demographic data of patients, examination for malaria parasites, and complete blood count (CBC) profiles were analyzed. Binary logistic regression was used to create the equation for malaria diagnosis. The diagnostic parameters of the equation were tested on 4,985 samples (703 infected and 4,282 control samples). The equation indicated 81.2% sensitivity and 80.3% specificity for predicting malaria infection. The positive and negative likelihood ratios were 4.12 (95% CI = 4.01-4.23) and 0.23 (95% CI = 0.22-0.25), respectively. The parameter also had an odds ratio of 17.6 (P value < 0.0001, 95% CI = 16.0-19.3). The equation can predict malaria infection after adjusting for age, gender, nationality, monocyte (%), platelet count, neutrophil (%), lymphocyte (%), and the RBC count of patients. The diagnostic accuracy (area under the curve, AUC) was 0.877 (95% CI = 0.871-0.883). The system, when used in combination with other clinical and microscopy methods, might improve malaria diagnosis and enhance prompt treatment.
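
    The reported likelihood ratios follow directly from the sensitivity and specificity. A quick sanity check — the figures are from the abstract; the helper function is ours:

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

# Figures reported in the abstract: 81.2% sensitivity, 80.3% specificity.
lr_pos, lr_neg = likelihood_ratios(0.812, 0.803)
```

    This reproduces the reported LR+ of about 4.12 and LR- of about 0.23.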

  15. Evaluation Tools: Student's Assessment of Faculty.

    ERIC Educational Resources Information Center

    Brown, David Lile; Hayes, Evelyn R.

    1979-01-01

    Discusses tools for student evaluation of college teaching in the nursing profession. Because much of the teaching is done in teams rather than by individuals, and much of it occurs outside the traditional classroom, special evaluation tools have been devised and are described. (JOW)

  16. Open software tools for eddy covariance flux partitioning

    USDA-ARS?s Scientific Manuscript database

    Agro-ecosystem management and assessment will benefit greatly from the development of reliable techniques for partitioning evapotranspiration (ET) into evaporation (E) and transpiration (T). Among other activities, flux partitioning can aid in evaluating consumptive vs. non-consumptive agricultural...

  17. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  18. Supervision of tunnelling constructions and software used for their evaluation

    NASA Astrophysics Data System (ADS)

    Caravanas, Aristotelis; Hilar, Matous

    2017-09-01

    Supervision is a common instrument for controlling the construction of tunnels. The supervision procedure is modified to suit a particular project's purposes by local conditions, habits, codes and the way the tunnelling project is allocated. The duties of tunnel supervision are specified in an agreement with the client and can include a wide range of activities. On large-scale tunnelling projects the supervision tasks are performed by a large number of people of different professions. Teamwork, smooth communication and coordination are required in order to successfully fulfil supervision tasks. The efficiency and quality of tunnel supervision work are enhanced when specialized software applications are used. Such applications should allow on-line data management and the prompt evaluation, reporting and sharing of relevant construction information. The client is provided with an as-built database that contains all the relevant information related to the construction process, which is a valuable tool for claim management as well as for the evaluation of structural defects that may occur in the future. As a result, the level of risk related to tunnel constructions is decreased.

  19. Software tools and frameworks in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Brun, R.

    2011-01-01

    In many fields of science and industry the computing environment has grown at an exponential speed in the past 30 years. From ad hoc solutions for each problem, the field has evolved gradually to use or reuse systems developed across the years for the same environment or coming from other fields with the same requirements. Several frameworks have emerged to solve common problems. In High Energy Physics (HEP) and Nuclear Physics, we have witnessed the emergence of common tools, packages and libraries that have gradually become the cornerstone of computing in these fields. The emergence of these systems has been complex because the computing field is evolving rapidly, the problems to be solved are more and more complex, and experiments now involve several thousand physicists from all over the world. This paper describes the emergence of these frameworks and their evolution from libraries of independent subroutines to task-oriented packages and to general experiment frameworks.

  20. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  1. Development of Oceanographic Software Tools and Applications for Navy Operational Use

    DTIC Science & Technology

    1997-09-30

    DEVELOPMENT OF OCEANOGRAPHIC SOFTWARE TOOLS AND APPLICATIONS FOR NAVY OPERATIONAL USE James H. Corbin Center for Air Sea Technology Mississippi State...applications, were significantly reduced. Accordingly, the CAST objective for FY97 was to develop interactive graphical tools for shipboard METOC briefers...This was in response to a COMSIXTHFLT validated METOC requirement to provide visualization briefing tools, animations, and 3-D graphical depictions

  2. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two focuses of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  3. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated the software hazard identification techniques available nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change. However, insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
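
    One simple way to organize such a comparison is a weighted-score table over the stated indexes. The weights and the example ratings below are illustrative placeholders, not values from the study:

```python
# The seven indexes from the study; the weights are illustrative placeholders.
WEIGHTS = {"dynamic capability": 0.20, "completeness": 0.20, "achievability": 0.15,
           "detail": 0.15, "signal/noise ratio": 0.10, "complexity": 0.10,
           "implementation cost": 0.10}

def score(ratings):
    """Weighted sum of 1-5 ratings for one hazard-identification combination."""
    return sum(WEIGHTS[index] * ratings[index] for index in WEIGHTS)

# Hypothetical ratings for a DFM-based combination, not the study's values.
dfm_ratings = {"dynamic capability": 5, "completeness": 3, "achievability": 4,
               "detail": 4, "signal/noise ratio": 4, "complexity": 2,
               "implementation cost": 2}
```

    Scoring each candidate combination the same way gives the stakeholders a single comparable number while keeping the per-index ratings visible for debate.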

  4. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    PubMed

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of a new tooth preparation assessment software, Preppr, as an educational tool for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year in a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and total convergence angles (TOC) between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns.

  5. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    SciTech Connect

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for evaluating the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic-level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and for showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next-generation Planning Tool. The goal of this collaboration was to create a simulation-based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risk and opportunity with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  6. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    PubMed

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development.

  7. Updates to the CMAQ Post Processing and Evaluation Tools for 2016

    EPA Science Inventory

    In the spring of 2016, the evaluation tools distributed with the CMAQ model code were updated and new tools were added to the existing set of tools. Observation data files, compatible with the AMET software, were also made available on the CMAS website for the first time with the...

  9. [Construction and evaluation of educational software on urinary indwelling catheters].

    PubMed

    Lopes, Ana Carolina Cristino; de Andrade Ferreira, Andréia; Fernandes, Jussara Alaíde Leite; da Silva Morita, Ana Beatriz Pinto; de Brito Poveda, Vanessa; de Souza, Adriano José Sorbile

    2011-03-01

    In an era of open access to information and the benefits it brings, the field of nursing informatics has come into its own. The objective of this study was to design educational software for teaching and learning the technique of indwelling urinary catheterization and to compare knowledge of the technique before and after use of the educational software. This is a descriptive study using a quantitative approach. The pedagogical foundations for designing the software were the theories of Piaget and Vygotsky. The teaching-learning process was evaluated through a questionnaire consisting of 10 multiple-choice questions, which the 60 participants completed before and after using the software. The results showed a significant gain in knowledge after use of the software, indicating that it is a useful aid in the teaching-learning process.

  10. A protocol building software tool for medical device quality control tests.

    PubMed

    Theodorakos, Y; Gueorguieva, K; Bliznakov, J; Kolitsi, Z; Pallikarakis, N

    1999-01-01

    Q-Pro is an application for Quality Control and Inspection of Medical Devices. General system requirements include a friendly, comprehensive graphical environment and a quick, easy, and intuitive user interface. The system provides functions such as a tool library for protocol design, widely used multimedia, and support for a local database for archiving protocol and inventory data. To serve the different categories of users involved in Quality Control procedures, the system has been split into three modules of differing functionality and complexity, each of which can work as a stand-alone application. Evaluators found the implementation of protocols, the use of the software functions, and the user interface itself to be clear and intuitive. The software adapts easily to different kinds of Quality Control procedures and objectives. Q-Pro effectively supports and enhances the processes needed to attain highly tuned, professional, responsive, and effective quality control and preventive maintenance procedures for biomedical equipment management.

  11. Programming Tools: Status, Evaluation, and Comparison

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.; Cooper, D. M. (Technical Monitor)

    1994-01-01

    In this tutorial I will first describe the characteristics of scientific applications and their developers, and describe the computing environment in a typical high-performance computing center. I will define the user requirements for tools that support application portability and present the difficulties in satisfying them. These form the basis of the evaluation and comparison of the tools. I will then describe the tools available in the market and the tools available in the public domain. Specifically, I will describe the tools for converting sequential programs, tools for developing portable new programs, tools for debugging and performance tuning, tools for partitioning and mapping, and tools for managing networks of resources. I will introduce the main goals and approaches of the tools, and show main features of a few tools in each category. Meanwhile, I will compare tool usability for real-world application development and compare their different technological approaches. Finally, I will indicate the future directions of the tools in each category.

  12. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.G.

    1990-09-01

    This report addresses the need for commercially available lighting design computer programs and evaluates several of these programs. Sandia National Laboratories uses these programs to provide lighting designs for exterior closed-circuit television camera intrusion detection assessment for high-security perimeters.

  13. PROBEmer: a web-based software tool for selecting optimal DNA oligos

    PubMed Central

    Emrich, Scott J.; Lowe, Mary; Delcher, Arthur L.

    2003-01-01

    PROBEmer (http://probemer.cs.loyola.edu) is a web-based software tool that enables a researcher to select optimal oligos for PCR applications and multiplex detection platforms including oligonucleotide microarrays and bead-based arrays. Given two groups of nucleic-acid sequences, a target group and a non-target group, the software identifies oligo sequences that occur in members of the target group, but not in the non-target group. To help predict potential cross hybridization, PROBEmer computes all near neighbors in the non-target group and displays their alignments. The software has been used to obtain genus-specific prokaryotic probes based on the 16S rRNA gene, gene-specific probes for expression analyses and PCR primers. In this paper, we describe how to use PROBEmer, the computational methods it employs, and experimental results for oligos identified by this software tool. PMID:12824409
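
    The core selection idea described above — oligos that occur in every member of the target group but in no member of the non-target group — can be sketched with a simple k-mer intersection. This is our simplification for illustration, not PROBEmer's actual algorithm, which also scores near-neighbor alignments to predict cross-hybridization; the sequences are invented.

```python
def kmers(seq, k):
    """All substrings of length k in seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def candidate_oligos(targets, non_targets, k=8):
    """k-mers present in every target sequence and absent from all non-targets."""
    shared = set.intersection(*(kmers(s, k) for s in targets))
    excluded = set().union(*(kmers(s, k) for s in non_targets)) if non_targets else set()
    return sorted(shared - excluded)

targets = ["ACGTACGTAGGCTTAA", "TTACGTACGTAGGCTA"]
non_targets = ["ACGTACGTACCCTTAA"]
print(candidate_oligos(targets, non_targets, k=6))
```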

  14. Evaluating uncertainty in integrated environmental models: A review of concepts and tools

    NASA Astrophysics Data System (ADS)

    Matott, L. Shawn; Babendreier, Justin E.; Purucker, S. Thomas

    2009-06-01

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with "standard" definitions are provided in the context of integrated approaches to environmental assessment. These constructs provide a reference point for cataloging 65 different model evaluation tools. Each tool is described briefly (in the auxiliary material) and is categorized for applicability across seven thematic model evaluation methods. Ratings for citation count and software availability are also provided, and a companion Web site containing download links for tool software is introduced. The paper concludes by reviewing strategies for tool interoperability and offers guidance for both practitioners and tool developers.

  15. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step and shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
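
    The DD/DTA combination that the test images probe can be sketched in one dimension. This is a minimal illustration of the global gamma-index idea, not the algorithm of any of the evaluated packages; the profiles and criteria are invented, and clinical tools operate on 2-D/3-D dose grids with finer interpolation.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd_pct=3.0, dta_mm=3.0):
    """Global 1-D gamma value at each evaluated point."""
    dd_crit = dd_pct / 100.0 * d_ref.max()   # global dose-difference criterion
    gammas = []
    for xe, de in zip(x_eval, d_eval):
        # gamma = min over reference points of the combined DD/DTA distance
        g = np.sqrt(((d_ref - de) / dd_crit) ** 2 + ((x_ref - xe) / dta_mm) ** 2)
        gammas.append(g.min())
    return np.array(gammas)

x = np.linspace(0.0, 30.0, 301)              # positions in mm
ref = np.exp(-((x - 15.0) / 6.0) ** 2)       # reference dose profile
meas = np.exp(-((x - 15.5) / 6.0) ** 2)      # "measured": shifted by 0.5 mm
g = gamma_1d(x, ref, x, meas)
print((g <= 1.0).mean() * 100.0, "% pass at 3%/3mm")
```

A 0.5 mm shift is well inside the 3 mm DTA criterion, so every point passes; tightening the criteria (e.g. 1%/1mm) is where implementations begin to disagree, as the study measured.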

  16. Evaluating School Counseling Websites: An Evaluation Tool

    ERIC Educational Resources Information Center

    Reynolds, Glenda P.; Kitchens, Helen

    2007-01-01

    The purpose of this paper is to describe the use of a webpage evaluation for embedding technology in classes for teaching school counseling and counseling program development. The instructors created the Website Evaluation Form to help students recognize qualities of webpages that would enhance the school counseling program, broaden their…

  17. Computational approaches and software tools for genetic linkage map estimation in plants.

    PubMed

    Cheema, Jitender; Dicks, Jo

    2009-11-01

    Genetic maps are an important component within the plant biologist's toolkit, underpinning crop plant improvement programs. The estimation of plant genetic maps is a conceptually simple yet computationally complex problem, growing ever more so with the development of inexpensive, high-throughput DNA markers. The challenge for bioinformaticians is to develop analytical methods and accompanying software tools that can cope with datasets of differing sizes, from tens to thousands of markers, that can incorporate the expert knowledge that plant biologists typically use when developing their maps, and that facilitate user-friendly approaches to achieving these goals. Here, we aim to give a flavour of computational approaches for genetic map estimation, discussing briefly many of the key concepts involved, and describing a selection of software tools that employ them. This review is intended both for plant geneticists as an introduction to software tools with which to estimate genetic maps, and for bioinformaticians as an introduction to the underlying computational approaches.
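
    A flavour of the "conceptually simple" core can be given with textbook two-point estimation: estimate the recombination fraction between a marker pair from progeny genotypes, then convert it to a map distance with a mapping function. These are standard formulas, not the method of any particular package discussed in the review; the genotype data are invented.

```python
import math

def recombination_fraction(genotypes_a, genotypes_b):
    """Fraction of progeny whose alleles at the two markers differ in phase."""
    recombinant = sum(a != b for a, b in zip(genotypes_a, genotypes_b))
    return recombinant / len(genotypes_a)

def haldane_cm(r):
    """Haldane mapping function: assumes no crossover interference."""
    return -50.0 * math.log(1.0 - 2.0 * r)

def kosambi_cm(r):
    """Kosambi mapping function: allows for partial interference."""
    return 25.0 * math.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))

a = [0, 0, 1, 1, 0, 1, 0, 1, 0, 0]   # marker A genotypes for 10 progeny
b = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]   # marker B genotypes
r = recombination_fraction(a, b)
print(r, haldane_cm(r), kosambi_cm(r))
```

The computational complexity the review emphasizes comes from ordering many markers at once, which turns these pairwise distances into a traveling-salesman-like optimization.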

  18. Software Mapping Assessment Tool Documenting Behavioral Content in Computer Interaction: Examples of Mapped Problems with "Kid Pix" Program

    ERIC Educational Resources Information Center

    Bayram, Servet

    2005-01-01

    The purpose of software mapping is to delineate a method for software menu, tool, and palette use in the construction of elementary school science and mathematics curriculum activities. With this method, software "maps" were created for traversing science and math curriculum problems and activities using software. The other purpose of…

  19. Virtual chromoendoscopy can be a useful software tool in capsule endoscopy.

    PubMed

    Duque, Gabriela; Almeida, Nuno; Figueiredo, Pedro; Monsanto, Pedro; Lopes, Sandra; Freire, Paulo; Ferreira, Manuela; Carvalho, Rita; Gouveia, Hermano; Sofia, Carlos

    2012-05-01

    Capsule endoscopy (CE) has revolutionized the study of the small bowel. One major drawback of this technique is that we cannot interfere with the image acquisition process. Therefore, the development of new software tools that could modify the images and increase both detection and diagnosis of small-bowel lesions would be very useful. The Flexible Spectral Imaging Color Enhancement (FICE) system, which allows for virtual chromoendoscopy, is one of these software tools. The aim was to evaluate the reproducibility and diagnostic accuracy of the FICE system in CE. This prospective study involved 20 patients. First, four physicians interpreted 150 static FICE images, and the overall agreement between them was determined using the Fleiss kappa test. Second, two experienced gastroenterologists, blinded to each other's results, analyzed the complete 20 video streams. One interpreted conventional capsule videos and the other the CE-FICE videos at setting 2. All findings were reported, regardless of their clinical value. Non-concordant findings between the two interpretations were analyzed by a consensus panel of four gastroenterologists, who reached a final result (positive or negative finding). In the first arm of the study, the overall concordance between the four gastroenterologists was substantial (0.650). In the second arm, the conventional mode identified 75 findings and the CE-FICE mode 95. The CE-FICE mode did not miss any lesions identified by the conventional mode and allowed the identification of a higher number of angiodysplasias (35 vs. 32) and erosions (41 vs. 24). There is reproducibility in the interpretation of CE-FICE images between different observers experienced in conventional CE. The use of virtual chromoendoscopy in CE appears to increase its diagnostic accuracy by highlighting small-bowel erosions and angiodysplasias that were not identified by the conventional mode.

  20. Online survey software as a data collection tool for medical education: A case study on lesson plan assessment

    PubMed Central

    Kimiafar, Khalil; Sarbaz, Masoumeh; Sheikhtaheri, Abbas

    2016-01-01

    Background: There are no general strategies or tools to evaluate daily lesson plans; however, assessments conducted using traditional methods usually include course plans. This study aimed to evaluate the strengths and weaknesses of online survey software in collecting data on education in medical fields and the application of such software to evaluate students' views and modify lesson plans. Methods: After investigating the available online survey software, esurveypro was selected for assessing daily lesson plans. After using the software for one semester, a questionnaire was prepared to assess the advantages and disadvantages of this method and students' views in a cross-sectional study. Results: The majority of the students (51.7%) rated the evaluation of classes per session (lesson plans) using the online survey as useful or very useful. About 51% (n=36) of the students considered this method effective in improving the management of each session, 67.1% (n=47) considered it effective in improving the management of sessions for the next semester, and 51.4% (n=36) said it had a high impact on improving the educational content of subsequent sessions. Finally, 61.4% (n=43) of the students expressed high and very high levels of satisfaction with using an online survey at each session. Conclusion: The use of online surveys may be appropriate for improving lesson plans and educational planning at different levels. This method can be used for other evaluations and for assessing people's opinions at different levels of an educational system. PMID:28491839

  2. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  4. The Need for Software Architecture Evaluation in the Acquisition of Software-Intensive Systems

    DTIC Science & Technology

    2014-01-01

    Excerpt from the scenario-generation framework table of Bass, Bachmann et al. (2003): the "Stimulus" element is briefly described as "a condition that needs to be considered when it arrives." Cited references include a paper on software architecture evaluation methods from the 15th Australian Software Engineering Conference and Bass, L., F. Bachmann and M. Klein (2003), "Deriving Architectural

  5. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. In this way, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
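
    The two classes of distributions the abstract describes can be illustrated with the continuous ranked probability score (CRPS). The closed-form expression for a normal predictive distribution and the sample-based estimator below are standard formulas from the proper-scoring-rule literature, written in Python for illustration rather than taken from the R package itself.

```python
import math

def crps_normal(mu, sigma, y):
    """Closed-form CRPS for a normal predictive distribution N(mu, sigma^2)."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

def crps_sample(draws, y):
    """CRPS estimate from simulation draws (e.g. MCMC output): E|X-y| - 0.5*E|X-X'|."""
    n = len(draws)
    term1 = sum(abs(x - y) for x in draws) / n
    term2 = sum(abs(a - b) for a in draws for b in draws) / (2.0 * n * n)
    return term1 - term2

print(crps_normal(0.0, 1.0, 0.0))  # ~0.2337, the CRPS of N(0,1) at its center
```

Lower scores are better; a forecast far from the outcome (e.g. `crps_normal(0, 1, 2)`) scores worse than one centered on it.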

  6. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    PubMed

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with fully automated software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
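
    The question DAISY answers symbolically can be seen numerically in a toy one-compartment model. This is our hand-rolled demonstration of non-identifiability, not DAISY's differential-algebra method: for x' = -p1*x, x(0) = x0, with output y = p2*x, the output y(t) = p2*x0*exp(-p1*t) determines p1 and the product p2*x0, but not p2 and x0 separately.

```python
import numpy as np

def output(p1, p2, x0, t):
    """Output trajectory y(t) = p2 * x0 * exp(-p1 * t) of the toy model."""
    return p2 * x0 * np.exp(-p1 * t)

t = np.linspace(0.0, 5.0, 100)
y_a = output(0.5, 2.0, 1.0, t)   # (p1, p2, x0) = (0.5, 2.0, 1.0)
y_b = output(0.5, 1.0, 2.0, t)   # different p2, x0 with the same product p2*x0
y_c = output(0.6, 2.0, 1.0, t)   # different p1

print(np.allclose(y_a, y_b))  # indistinguishable: p2, x0 not separately identifiable
print(np.allclose(y_a, y_c))  # distinguishable: p1 is identifiable
```

A numerical check like this can only exhibit a counterexample; establishing global identifiability for all parameter values is exactly the symbolic problem the differential-algebra algorithm solves.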

  7. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    ERIC Educational Resources Information Center

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  8. Screening and Evaluation Tool (SET) Users Guide

    SciTech Connect

    Pincock, Layne

    2014-10-01

    This document is the user's guide for the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.
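    A minimal sketch of multi-attribute utility scoring of the kind described above (the criteria names, weights, and option scores here are hypothetical, not taken from SET):

```python
def mau_score(option, weights):
    # Weighted-sum multi-attribute utility: each metric is assumed
    # already normalized to a 0-1 utility, and the weights sum to 1.
    return sum(weights[m] * u for m, u in option.items())

weights = {"cost": 0.4, "waste": 0.35, "proliferation": 0.25}
options = {
    "cycle_A": {"cost": 0.6, "waste": 0.8, "proliferation": 0.7},
    "cycle_B": {"cost": 0.9, "waste": 0.5, "proliferation": 0.6},
}

# Rank fuel cycle options by their aggregate utility, best first.
ranked = sorted(options, key=lambda o: mau_score(options[o], weights), reverse=True)
print(ranked)
```

Real multi-attribute utility analysis also involves eliciting the weights and utility curves from stakeholders; the arithmetic above is only the final aggregation step.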

  9. Stage Separation CFD Tool Development and Evaluation

    NASA Technical Reports Server (NTRS)

    Droege, Alan; Gomez, Reynaldo; Wang, Ten-See

    2002-01-01

    This viewgraph presentation evaluates CFD (Computational Fluid Dynamics) tools for solving stage separation problems. The demonstration and validation of the tools is for a second generation RLV (Reusable Launch Vehicle) stage separation. The flow solvers are: Cart3D; Overflow/Overflow-D; Unic.

  10. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    USGS Publications Warehouse

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter uncertainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to two case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter uncertainty to support comparisons and more reliable interpretations of transport processes.
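    The Monte Carlo idea behind such an uncertainty analysis can be sketched in a few lines. Here a trivial exponential curve stands in for a full OTIS simulation, and samples whose fit error stays below a threshold are kept as "behavioral" parameter sets; the spread of the behavioral set is the uncertainty estimate (all values are synthetic):

```python
import math
import random

random.seed(42)

def model(alpha, t):
    # Toy stand-in for a full OTIS transient-storage simulation.
    return math.exp(-alpha * t)

# Synthetic "observed" tracer data generated with alpha = 0.3.
obs = [(t, model(0.3, t)) for t in (0.5, 1.0, 2.0, 4.0)]

def rmse(alpha):
    return math.sqrt(sum((model(alpha, t) - y) ** 2 for t, y in obs) / len(obs))

# Monte Carlo parameter sampling: keep only samples whose simulated
# curve closely matches the observations ("behavioral" sets).
samples = [random.uniform(0.0, 1.0) for _ in range(5000)]
behavioral = [a for a in samples if rmse(a) < 0.02]
lo, hi = min(behavioral), max(behavioral)
print(lo, hi)  # the behavioral range brackets the true alpha = 0.3
```

A wide behavioral range signals a poorly constrained parameter; in the tool described above, the same logic is applied to the full OTIS parameter vector with visualization of the results.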

  11. Evaluating modeling tools for the EDOS

    NASA Technical Reports Server (NTRS)

    Knoble, Gordon; Mccaleb, Frederick; Aslam, Tanweer; Nester, Paul

    1994-01-01

    The Earth Observing System (EOS) Data and Operations System (EDOS) Project is developing a functional, system performance model to support the system implementation phase of the EDOS which is being designed and built by the Goddard Space Flight Center (GSFC). The EDOS Project will use modeling to meet two key objectives: (1) manage system design impacts introduced by unplanned changes in mission requirements; and (2) evaluate evolutionary technology insertions throughout the development of the EDOS. To select a suitable modeling tool, the EDOS modeling team developed an approach for evaluating modeling tools and languages by deriving evaluation criteria from both the EDOS modeling requirements and the development plan. Essential and optional features for an appropriate modeling tool were identified and compared with known capabilities of several modeling tools. Vendors were also provided the opportunity to model a representative EDOS processing function to demonstrate the applicability of their modeling tool to the EDOS modeling requirements. This paper emphasizes the importance of using a well-defined approach for evaluating tools to model complex systems like the EDOS. The results of this evaluation study do not in any way signify the superiority of any one modeling tool, since the results will vary with the specific modeling requirements of each project.

  12. Knowledge-engineering software. A demonstration of a high-end tool.

    PubMed

    Salzman, G C; Krall, R B; Marinuzzi, J G

    1988-06-01

    Many investigators wanting to apply knowledge-based systems (KBSs) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with such high-end KBS tools as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. This paper illustrates the capability of one of these high-end tools. First, a table showing the classification of benign soft tissue tumors was converted into a KEE knowledge base. The tools available in KEE were then used to identify the tumor type for a hypothetical patient.

  13. Proceedings of the Workshop on software tools for distributed intelligent control systems

    SciTech Connect

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  14. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    SciTech Connect

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A.

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  15. Utilizing self-assessment software to evaluate student wax-ups in dental morphology.

    PubMed

    McPherson, Karen R; Mennito, Anthony S; Vuthiganon, Jompobe; Kritzas, Yianne G; McKinney, Richard A; Wolf, Bethany J; Renne, Walter G

    2015-06-01

    Traditionally, evaluating student work in preclinical courses has relied on the judgment of experienced clinicians utilizing visual inspection. However, research has shown significant disagreement between different evaluators (interrater reliability) and between results from the same evaluator at different times (intrarater reliability). This study evaluated new experimental software (E4D Compare) that compares 66 student-produced tooth wax-ups at one U.S. dental school to an ideal standard after both had been digitally scanned. Using 3D surface-mapping technology, a numerical evaluation was generated by calculating the surface area of the student's work that was within a set range of the ideal. The aims of the study were to compare the reliability of faculty and software grades and to determine the ideal tolerance value for the software. The investigators hypothesized that the software would provide more consistent feedback than visual grading and that a tolerance value could be determined that closely correlated with the faculty grade. The results showed that a tolerance level of 450 μm provided 96% agreement of grades compared with only 53% agreement for faculty. The results suggest that this software could be used by faculty members as a mechanism to evaluate student work and for students to use as a self-assessment tool.
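    The grading principle is simple once the scans are registered: count the fraction of surface points whose deviation from the ideal falls inside the tolerance band. A sketch with hypothetical per-point deviations (not data from the study):

```python
def surface_grade(deviations_um, tolerance_um=450):
    # Percentage of scanned surface points whose deviation from the
    # ideal wax-up lies within the +/- tolerance band (micrometers).
    within = sum(1 for d in deviations_um if abs(d) <= tolerance_um)
    return 100.0 * within / len(deviations_um)

# Hypothetical signed deviations (um) from a scan-to-ideal comparison.
deviations = [12, -300, 95, 520, -460, 210, 430, -80, 610, 5]
grade = surface_grade(deviations)
print(grade)  # 70.0
```

In the actual software the deviations come from dense 3D surface mapping, so the point count is in the thousands rather than ten.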

  16. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs with freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and the geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed; consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. Motivated by this, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; accuracy investigations were performed in this paper using check points and images obtained from a UAV.
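    Accuracy evaluation against check points typically reduces to per-axis RMSE between the reconstructed and surveyed coordinates. A minimal sketch with hypothetical coordinates (metres):

```python
import math

def rmse_3d(estimated, reference):
    # Per-axis root-mean-square error over check points, the usual
    # accuracy measure for photogrammetric 3D reconstruction.
    n = len(reference)
    sq = [0.0, 0.0, 0.0]
    for est, ref in zip(estimated, reference):
        for i in range(3):
            sq[i] += (est[i] - ref[i]) ** 2
    return tuple(math.sqrt(s / n) for s in sq)

# Hypothetical surveyed check points and their reconstructed positions.
ref = [(0.0, 0.0, 10.0), (5.0, 0.0, 10.0), (0.0, 5.0, 12.0)]
est = [(0.02, -0.01, 10.05), (5.03, 0.02, 9.96), (-0.01, 5.01, 12.07)]
errors = rmse_3d(est, ref)
print(errors)  # (rmse_x, rmse_y, rmse_z) in metres
```

Comparing these per-axis values across software packages is how black-box reconstruction pipelines can be evaluated without access to their internals.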

  17. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  18. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), creating user-friendly tools for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models, including the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple: a normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average
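    The S/N figure of merit such tools report is, at its core, the ratio of the peak defect echo amplitude to the RMS level of the grain noise, often quoted in decibels. A minimal sketch with hypothetical amplitudes (the full models compute both quantities from physics rather than from measured samples):

```python
import math

def snr_db(peak_signal, noise_samples):
    # S/N as peak defect-echo amplitude over RMS grain noise,
    # expressed in decibels (20*log10 for amplitude ratios).
    rms = math.sqrt(sum(x * x for x in noise_samples) / len(noise_samples))
    return 20.0 * math.log10(peak_signal / rms)

# Hypothetical digitized grain-noise samples and defect peak (volts).
grain_noise = [0.02, -0.015, 0.01, -0.025, 0.018, -0.012]
snr = snr_db(0.5, grain_noise)
print(round(snr, 1))
```

An inspection is typically considered reliable only when this ratio comfortably exceeds the detection threshold across the whole inspected volume.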

  19. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  20. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    PubMed

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
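    Coordinate checking of the kind SDMdata performs can be sketched as a few validation rules; the specific checks below (range limits and the common (0, 0) placeholder) are our illustration, not SDMdata's actual rule set:

```python
def check_record(species, lat, lon):
    # Minimal occurrence-record validation: flag impossible
    # coordinates and the common (0, 0) placeholder value.
    errors = []
    if not (-90.0 <= lat <= 90.0):
        errors.append("latitude out of range")
    if not (-180.0 <= lon <= 180.0):
        errors.append("longitude out of range")
    if lat == 0.0 and lon == 0.0:
        errors.append("suspicious (0, 0) coordinate")
    return errors

bad = check_record("Rana sp.", 95.2, 12.0)
ok = check_record("Rana sp.", 41.9, 12.5)
print(bad)  # ['latitude out of range']
print(ok)   # []
```

A production tool would add further checks, such as species-name resolution against a taxonomic backbone and country/coordinate consistency.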

  1. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  2. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow.
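    Scheduled MRM works because each transition only needs to be monitored near its peptide's retention time; the instrument's duty cycle is limited by the maximum number of transitions that are active simultaneously. That peak concurrency can be computed with a simple sweep over window boundaries (transition names and retention times below are hypothetical):

```python
def max_concurrent(transitions, window=1.0):
    # Sweep-line count of how many transitions are acquired at once
    # when each is monitored only within +/- `window` minutes of its
    # scheduled retention time.
    events = []
    for _, rt in transitions:
        events.append((rt - window, 1))   # window opens
        events.append((rt + window, -1))  # window closes
    concurrent = peak = 0
    for _, delta in sorted(events):
        concurrent += delta
        peak = max(peak, concurrent)
    return peak

# Hypothetical (transition name, retention time in minutes) pairs.
trans = [("pepA_y7", 10.2), ("pepB_y5", 10.8), ("pepC_y6", 14.5), ("pepD_y4", 11.1)]
peak = max_concurrent(trans)
print(peak)  # 3
```

Keeping this peak low (by narrowing windows or spreading assays across runs) is what lets thousands of total transitions fit into a single LC gradient.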

  3. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  4. mMass as a software tool for the annotation of cyclic peptide tandem mass spectra.

    PubMed

    Niedermeyer, Timo H J; Strohalm, Martin

    2012-01-01

    Natural or synthetic cyclic peptides often possess pronounced bioactivity. Their mass spectrometric characterization is difficult due to the predominant occurrence of non-proteinogenic monomers and the complex fragmentation patterns observed. Even though several software tools for cyclic peptide tandem mass spectra annotation have been published, these tools are still unable to annotate a majority of the signals observed in experimentally obtained mass spectra. They are thus not suitable for extensive mass spectrometric characterization of these compounds. This lack of advanced and user-friendly software tools has motivated us to extend the fragmentation module of a freely available open-source software, mMass (http://www.mmass.org), to allow for cyclic peptide tandem mass spectra annotation and interpretation. The resulting software has been tested on several cyanobacterial and other naturally occurring peptides. It has been found to be superior to other currently available tools concerning both usability and annotation extensiveness. Thus it is highly useful for accelerating the structure confirmation and elucidation of cyclic as well as linear peptides and depsipeptides.
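    The combinatorics that make cyclic peptide spectra hard to annotate manually are easy to state: the ring can open at any bond, so every contiguous run of residues (in the cyclic sense) is a candidate fragment. A sketch enumerating those candidate masses (the residue masses below are hypothetical inputs, and real annotation also applies ion-type and adduct offsets):

```python
def cyclic_fragments(residue_masses):
    # A cyclic peptide can ring-open at any bond, so every contiguous
    # run of residues, wrapping around the ring, is a candidate fragment.
    n = len(residue_masses)
    masses = set()
    for start in range(n):
        total = 0.0
        for length in range(1, n):
            total += residue_masses[(start + length - 1) % n]
            masses.add(round(total, 4))
    masses.add(round(sum(residue_masses), 4))  # the intact ring
    return sorted(masses)

# Hypothetical monoisotopic residue masses for a cyclic tetrapeptide.
frags = cyclic_fragments([71.0371, 97.0528, 113.0841, 87.0320])
print(len(frags))  # 13 distinct candidate fragment masses
```

For an n-residue ring this yields up to n*(n-1)+1 candidate masses, which is why tool support matters as n grows.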

  5. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  6. DairyGEM: A software tool for assessing emissions and mitigation strategies for dairy production systems

    USDA-ARS?s Scientific Manuscript database

    Many gaseous compounds are emitted from dairy farms. Those of current interest include the toxic compounds of ammonia and hydrogen sulfide and the greenhouse gases of methane, nitrous oxide and carbon dioxide. A relatively easy to use software tool was developed that predicts these emissions through...

  7. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    ERIC Educational Resources Information Center

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  9. Understanding Collaborative Learning: Small Group Work on Contextual Problems Using a Multi-Representational Software Tool.

    ERIC Educational Resources Information Center

    Smith, Erick; Confrey, Jere

    The interactions of three high school juniors (two females and one male) working together on a series of contextual mathematics problems using a multirepresentational software tool were studied. Focus was on determining how a constructivist model of learning, based on an individual problematic-action-reflection model, can be extended to offer…

  10. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  11. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    ERIC Educational Resources Information Center

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  12. DairyGEM: a software tool for whole farm assessment of emission mitigation strategies

    USDA-ARS?s Scientific Manuscript database

    Accurate assessment of the impact of management on agricultural emissions requires consideration of many farm components and their interactions. A comprehensive assessment is needed because changes made to reduce one emission type or source may increase another. A new software tool was developed tha...

  14. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
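    Disproportionality data mining of the kind described above is typically built on 2x2 contingency counts; one common signal score is the Proportional Reporting Ratio (PRR). A sketch with hypothetical citation counts (the paper does not specify which disproportionality statistic its tool uses, so PRR here is an illustrative choice):

```python
def prr(a, b, c, d):
    # Proportional Reporting Ratio from a 2x2 contingency table:
    #   a = records mentioning both drug and event,
    #   b = drug without the event,
    #   c = event without the drug,
    #   d = records mentioning neither.
    drug_rate = a / (a + b)
    other_rate = c / (c + d)
    return drug_rate / other_rate

# Hypothetical counts mined from MeSH-indexed PubMed/MEDLINE citations.
score = prr(a=20, b=480, c=40, d=9460)
print(round(score, 2))  # 9.5
```

A score well above 1 (commonly with thresholds on the count a and a confidence interval) flags a drug-event pair for manual review; the tool then presents such scores with visual analytics.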

  15. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    SciTech Connect

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on builds accounts for a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.
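    The core task of any meta-build tool like MixDown is ordering the third-party libraries so that each is configured and built only after its dependencies. That is a topological sort of the dependency graph; a minimal sketch (the package names are hypothetical, and this is not MixDown's actual algorithm):

```python
def build_order(deps):
    # Depth-first topological sort: each package is appended to the
    # build order only after everything it depends on.
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for d in deps.get(pkg, []):
            visit(d)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

# Hypothetical third-party dependency graph for a scientific code.
deps = {"app": ["hdf5", "petsc"], "petsc": ["blas", "mpi"], "hdf5": ["mpi"]}
order = build_order(deps)
print(order)  # dependencies always precede their dependents
```

A real meta-build tool layers per-package configure/build/install steps, cycle detection, and caching on top of this ordering.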

  16. Biogem: an effective tool-based approach for scaling up open source software development in bioinformatics

    PubMed Central

    Bonnal, Raoul J.P.; Aerts, Jan; Githinji, George; Goto, Naohisa; MacLean, Dan; Miller, Chase A.; Mishima, Hiroyuki; Pagani, Massimiliano; Ramirez-Gonzalez, Ricardo; Smant, Geert; Strozzi, Francesco; Syme, Rob; Vos, Rutger; Wennblom, Trevor J.; Woodcroft, Ben J.; Katayama, Toshiaki; Prins, Pjotr

    2012-01-01

    Summary: Biogem provides a software development environment for the Ruby programming language, which encourages community-based software development for bioinformatics while lowering the barrier to entry and encouraging best practices. Biogem, with its targeted modular and decentralized approach, software generator, tools, and tight web integration, is an improved general model for scaling up collaborative open source software development in bioinformatics. Availability: Biogem and its modules are free, open source software (OSS). Biogem runs on all systems that support recent versions of Ruby, including Linux, Mac OS X and Windows. Further information at http://www.biogems.info. A tutorial is available at http://www.biogems.info/howto.html Contact: bonnal@ingm.org PMID:22332238

  17. Three dimensional planning target volumes: a model and a software tool.

    PubMed

    Austin-Seymour, M; Kalet, I; McDonald, J; Kromhout-Schiro, S; Jacky, J; Hummel, S; Unger, J

    1995-12-01

    Three dimensional (3D) target volumes are an essential component of conformal therapy because the goal is to shape the treatment volume to the target volume. The planning target volume (PTV) is defined by ICRU 50 as the clinical target volume (CTV) plus a margin to ensure that the CTV receives the prescribed dose. The margin must include all interfractional and intrafractional treatment variations. This paper describes a software tool that automatically generates 3D PTVs from CTVs for lung cancers and immobile head and neck cancers. Values for the interfractional and intrafractional treatment variations were determined by a literature review and by targeted interviews with physicians. The software tool is written in Common LISP and conforms to the specifications for shareable software of the Radiotherapy Treatment Planning Tools Collaborative Working Group. The tool is a rule-based expert system in which the inputs are the CTV contours, critical structure contours, and qualitative information about the specific patient. The output is PTV contours, which are a cylindrical expansion of the CTV. A model for creating PTVs from CTVs is embedded in the tool. The interfractional variation of setup uncertainty and the intrafractional variations of movement of the CTV (e.g., respiration) and patient motion are included in the model. Measured data for the component variations is consistent with modeling the components as independent samples from 3D Gaussian distributions. The components are combined using multivariate normal statistics to yield the cylindrical expansion factors. Rules are used to represent the values of the components for certain patient conditions (e.g., setup uncertainty for a head and neck patient immobilized in a mask). The tool uses a rule interpreter to combine qualitative information about a specific patient with rules representing the value of the components and to enter the appropriate component values for that patient into the cylindrical expansion
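    The margin model the abstract describes, independent 3D Gaussian components combined with multivariate normal statistics, reduces per axis to adding variances in quadrature. A minimal sketch (all standard deviations below are illustrative, not values from the paper):

```python
import math

# Hypothetical per-axis standard deviations (mm) for the independent
# uncertainty components of the model: (left-right, ant-post, sup-inf).
setup = (2.0, 2.0, 3.0)          # interfractional setup uncertainty
respiration = (1.0, 1.0, 5.0)    # intrafractional CTV movement
patient_motion = (1.0, 1.0, 1.0)

# Independent Gaussian components combine in quadrature on each axis.
def combined_sigma(*components):
    return tuple(math.sqrt(sum(c[i] ** 2 for c in components))
                 for i in range(3))

sigma = combined_sigma(setup, respiration, patient_motion)

# Expand the CTV by z * sigma per axis to cover, e.g., ~95% of the
# variation on that axis (z = 1.96 for a one-dimensional 95% interval).
z = 1.96
margin = tuple(round(z * s, 2) for s in sigma)
```

    Note the largest component dominates: a 5 mm respiratory sigma drives the superior-inferior margin far more than the 1 mm motion term, which is why such models weight components rather than simply summing them.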

  18. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software to provide reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP .
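    The nested model analysis mentioned above typically amounts to asking whether a fuller kinetic model (with extra parameters) fits significantly better than a simpler model contained within it. A common way to decide is an F-test on residual sums of squares; the sketch below uses illustrative numbers and is not ROCKETSHIP's actual implementation:

```python
# F-test for nested model selection: the extra parameter of the fuller
# model is kept only if it significantly reduces the residual error.
def f_statistic(rss_simple, rss_full, p_simple, p_full, n):
    """F comparing a simple model (p_simple params) nested in a fuller
    one (p_full params), both fit to n data points."""
    numerator = (rss_simple - rss_full) / (p_full - p_simple)
    denominator = rss_full / (n - p_full)
    return numerator / denominator

# Illustrative: a 2-parameter kinetic model nested in a 3-parameter
# extension, fit to a 60-point enhancement curve.
F = f_statistic(rss_simple=4.0, rss_full=2.5, p_simple=2, p_full=3, n=60)
```

    A large F (compared against the F-distribution with the matching degrees of freedom) favors the fuller model; in practice the resulting p-value, not the raw F, drives the per-voxel model choice.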

  19. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    PubMed

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets.
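    The 128 combinations cited above are simply the 2^7 positive/negative states of 7 Boolean gates, which is why per-combination analysis outgrows basic software so quickly. A small sketch (the marker names are illustrative placeholders):

```python
from itertools import product

# Seven per-cell readouts: six intracellular cytokines plus a
# degranulation marker (names here are illustrative, not the panel
# used in the study).
markers = ["IFNg", "TNFa", "IL2", "IL4", "IL17", "IL22", "CD107a"]

# Every positivity/negativity combination a Boolean gating analysis
# must enumerate: 2 ** 7 = 128 subsets.
combinations = list(product([True, False], repeat=len(markers)))

# Polyfunctionality of a single cell = number of simultaneously
# positive markers in its combination.
def polyfunctionality(cell):
    return sum(cell)
```

    Tools like SPICE then aggregate cell counts per combination and group them by polyfunctionality degree, which is the readout compared across stimulation conditions.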

  20. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  1. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    PubMed

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  2. Software Engineering Laboratory (SEL) programmer workbench phase 1 evaluation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Phase 1 of the SEL programmer workbench consists of the design of the following three components: communications link, command language processor, and collection of software aids. A brief description, an evaluation, and recommendations are presented for each of these three components.

  3. Learning English Electronically: Formative Evaluation in ESL Software.

    ERIC Educational Resources Information Center

    Schnackenberg, Heidi L.

    Learning English Electronically (LEE), a computer software package designed for adult English as a Second Language (ESL) students enrolled in intermediate level community college ESL classes, was evaluated at Glendale Community College in Glendale, Arizona to assess student and teacher attitudes toward the program. LEE consists of 43 lessons…

  4. Microcomputers: Instrument Generation Software. Evaluation Guides. Guide Number 11.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    Designed to assist evaluators in selecting the appropriate software for the generation of various data collection instruments, this guide discusses such key program characteristics as text entry, item storage and manipulation, item retrieval, and printing. Some characteristics of a good instrument generation program are discussed; these include…

  5. The Software Jungle: Guidelines for Evaluating English Computer Programs.

    ERIC Educational Resources Information Center

    Hitchcock, Richard B.

    Noting that the educational software market is expanding at a precipitous rate, but with considerable variation in quality, this paper outlines a thorough set of tangible and useful questions, based on two years of program design for English and writing lab usage. The major portion of the paper discusses criteria for evaluation in each of the…

  6. Criteria for the Evaluation of Text Storage and Retrieval Software.

    ERIC Educational Resources Information Center

    Nieuwenhuysen, Paul

    1988-01-01

    Presents criteria in the following areas for evaluation and selection of software for storage, management, and retrieval of text information: (1) input of information; (2) indexing; (3) interactive searching for information; (4) output; (5) selective dissemination of information; (6) security; (7) availability of cheaper, limited versions; and (8)…

  7. Evaluation of Software Dependability at the Architecture Definition Stage

    DTIC Science & Technology

    2010-06-01

    [Babar et al. 2004] proposes a framework for the comparison and assessment of architecture-oriented approaches to software evaluation (Babar, M., Zhu, L. and Jeffrey, R., A Framework for Classifying and Comparing Software Architecture Evaluation Methods, Jan-Mar 2004).

  8. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    PubMed

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool based on a text-mining algorithm that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
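    Pattern-based extraction of patent fields, the kind of step a tool like IPAT performs before tabulating results, can be sketched with a regular expression. The pattern and sample text below are illustrative only, not IPAT's actual algorithm:

```python
import re

# Illustrative pattern for publication numbers in US/EP/WO formats,
# e.g. "US8765432 B2" or "WO 2014/123456 A1". Real patent-number
# grammars are considerably messier than this sketch.
PATENT_RE = re.compile(r"\b(US|EP|WO)\s?(\d{4}/?\d{6}|\d{7,8})\s?([AB]\d)?\b")

text = "Related filings include US8765432 B2 and WO 2014/123456 A1."

# Each match becomes one row of a results table (an Excel sheet, in
# IPAT's case).
rows = [m.group(0) for m in PATENT_RE.finditer(text)]
```

    After extraction, the rows would be grouped by assignee, year, or classification code to produce the landscape charts the abstract mentions.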

  9. Criteri per la valutazione di software pedagogico grammaticale (Criteria for the Evaluation of Software for the Teaching of Grammar).

    ERIC Educational Resources Information Center

    Bancheri, Salvatore

    1997-01-01

    Presents criteria teachers can use when evaluating software designed to teach grammar to students of Italian-as-a-Second-Language. The importance of collaboration between programmers and academic experts is stressed in creating software and the need for teachers to choose software as carefully as they select textbooks. (CFM)

  10. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    PubMed

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology

    PubMed Central

    Latendresse, Mario; Paley, Suzanne M.; Krummenacker, Markus; Ong, Quang D.; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M.; Caspi, Ron

    2016-01-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. PMID:26454094

  12. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  13. A proposal for reverse engineering CASE tools to support new software development

    SciTech Connect

    Maxted, A.

    1993-06-01

    Current CASE technology provides sophisticated diagramming tools to generate a software design. The design, stored internal to the CASE tool, is bridged to the code via code generators. There are several limitations to this technique: (1) the portability of the design is limited to the portability of the CASE tools, and (2) the code generators offer a clumsy link between design and code. The CASE tool, though valuable during design, becomes a hindrance during implementation. Frustration frequently causes the CASE tool to be abandoned during implementation, permanently severing the link between design and code. Current CASE tools store the design in a CASE-internal structure, from which code is generated. The technique presented herein suggests that CASE tools store the system knowledge directly in code. The CASE support then switches from an emphasis on code generators to employing state-of-the-art reverse engineering techniques for document generation. Graphical and textual descriptions of each software component (e.g., Ada Package) may be generated via reverse engineering techniques from the code. These reverse engineered descriptions can be merged with system overview diagrams to form a top-level design document. The resulting document can readily reflect changes to the software components by automatically generating new component descriptions for the changed components. The proposed auto documentation technique facilitates the document upgrade task at later stages of development (e.g., design, implementation and delivery) by using the component code as the source of the component descriptions. The CASE technique presented herein is a unique application of reverse engineering techniques to new software systems. This technique contrasts with more traditional CASE auto code generation techniques.
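    The reverse-engineering idea above, treating the code itself as the source of truth and generating component descriptions from it, can be illustrated in miniature with Python's ast module (the proposal concerns Ada packages; Python and the sample source below are stand-ins for the concept):

```python
import ast

# A made-up component whose documentation lives in the code itself.
source = '''
"""Orbit propagation component."""

def propagate(state, dt):
    """Advance a state vector by dt seconds."""
    return state
'''

# Reverse-engineer a component description directly from the code:
# parse it and collect the module and function docstrings.
tree = ast.parse(source)
doc = {"module": ast.get_docstring(tree)}
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        doc[node.name] = ast.get_docstring(node)
```

    Because the description is regenerated from the code, it cannot drift out of date the way a separately stored CASE design can, which is exactly the maintenance property the proposal argues for.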

  14. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, four-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  15. ConsensusCluster: a software tool for unsupervised cluster discovery in numerical data.

    PubMed

    Seiler, Michael; Huang, C Chris; Szalma, Sandor; Bhanot, Gyan

    2010-02-01

    We have created a stand-alone software tool, ConsensusCluster, for the analysis of high-dimensional single nucleotide polymorphism (SNP) and gene expression microarray data. Our software implements the consensus clustering algorithm and principal component analysis to stratify the data into a given number of robust clusters. The robustness is achieved by combining clustering results from data and sample resampling as well as by averaging over various algorithms and parameter settings to achieve accurate, stable clustering results. We have implemented several different clustering algorithms in the software, including K-Means, Partition Around Medoids, Self-Organizing Map, and Hierarchical clustering methods. After clustering the data, ConsensusCluster generates a consensus matrix heatmap to give a useful visual representation of cluster membership, and automatically generates a log of selected features that distinguish each pair of clusters. ConsensusCluster gives more robust and more reliable clusters than common software packages and, therefore, is a powerful unsupervised learning tool that finds hidden patterns in data that might shed light on its biological interpretation. This software is free and available from http://code.google.com/p/consensus-cluster .
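    The consensus clustering at the heart of the tool above rests on a simple object: a consensus matrix counting how often each pair of samples lands in the same cluster across resampled runs. A stdlib-only sketch (the label matrices stand in for real clustering output):

```python
# Given cluster labels from several resampled clustering runs, the
# consensus matrix entry (i, j) is the fraction of runs in which
# samples i and j were assigned to the same cluster.
def consensus_matrix(runs):
    n = len(runs[0])
    m = [[0.0] * n for _ in range(n)]
    for labels in runs:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0
    scale = 1.0 / len(runs)
    return [[v * scale for v in row] for row in m]

# Three illustrative runs over five samples. Cluster IDs are arbitrary
# per run; only co-membership matters, so runs 1 and 2 agree exactly.
runs = [[0, 0, 0, 1, 1],
        [1, 1, 1, 0, 0],
        [0, 0, 1, 1, 1]]
C = consensus_matrix(runs)
```

    Pairs with consensus near 1 or 0 are stably together or apart; intermediate values flag unstable assignments, and a heatmap of C (as ConsensusCluster produces) makes the robust clusters visible as blocks.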

  16. Backup flight control system functional evaluator software manual

    NASA Technical Reports Server (NTRS)

    Helmke, C. A.; Hasara, S. H.; Mount, F. E.

    1977-01-01

    The software for the Backup Flight Control System Functional Evaluator (BFCSFE) on a Data General Corporation Nova 1200 computer consists of three programs: the ground support program, the operational flight program (OFP), and the ground pulse code modulation (PCM) program. The Nova OFP software is structurally as close as possible to the AP101 code; therefore, this document highlights and describes only those areas of the Nova OFP that are significantly different from the AP101. Since the Ground Support Program was developed to meet BFCSFE requirements and differs considerably from the AP101 code, it is described in detail.

  17. BatchQC: interactive software for evaluating sample and batch effects in genomic data.

    PubMed

    Manimaran, Solaiappan; Selby, Heather Marie; Okrah, Kwame; Ruberman, Claire; Leek, Jeffrey T; Quackenbush, John; Haibe-Kains, Benjamin; Bravo, Hector Corrada; Johnson, W Evan

    2016-12-15

    Sequencing and microarray samples often are collected or processed in multiple batches or at different times. This often produces technical biases that can lead to incorrect results in the downstream analysis. There are several existing batch adjustment tools for '-omics' data, but they do not indicate a priori whether adjustment needs to be conducted or how correction should be applied. We present a software pipeline, BatchQC, which addresses these issues using interactive visualizations and statistics that evaluate the impact of batch effects in a genomic dataset. BatchQC can also apply existing adjustment tools and allow users to evaluate their benefits interactively. We used the BatchQC pipeline on both simulated and real data to demonstrate the effectiveness of this software toolkit.
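    One of the statistics a batch-effect evaluation can rest on is a one-way ANOVA of a feature's expression across batches: a large F indicates the batch labels explain much of the variance. A stdlib-only sketch with illustrative numbers (this is not BatchQC's actual code):

```python
from statistics import mean

# One-way ANOVA F-statistic: between-batch variance over within-batch
# variance for a single feature measured in several batches.
def anova_f(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(v for g in groups for v in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative log-expression of one gene in two processing batches;
# the clear shift between batches suggests a technical batch effect.
batch1 = [5.1, 5.3, 5.0, 5.2]
batch2 = [7.9, 8.1, 8.0, 7.8]
F = anova_f([batch1, batch2])
```

    In a real pipeline this screen is run per feature and the F (or p-value) distribution is inspected before deciding whether an adjustment method such as ComBat is warranted.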

  18. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766
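    A representative computation in the DTI processing pipelines this review covers is fractional anisotropy (FA), a scalar derived from the three eigenvalues of the diffusion tensor. A minimal sketch of the standard formula (eigenvalues below are illustrative):

```python
import math

# Fractional anisotropy from the diffusion tensor's eigenvalues:
# 0 for isotropic diffusion, approaching 1 for strongly directional
# diffusion such as within coherent white-matter fiber bundles.
def fractional_anisotropy(l1, l2, l3):
    md = (l1 + l2 + l3) / 3.0  # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

fa_isotropic = fractional_anisotropy(1.0, 1.0, 1.0)  # no preferred direction
fa_fiber = fractional_anisotropy(1.7, 0.3, 0.3)      # fiber-like tensor
```

    FA maps are one of the post-processing outputs the software packages listed in the review compute after tensor fitting and eigendecomposition of each voxel.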

  19. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    PubMed

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes.

  20. On the evaluation of segmentation editing tools

    PubMed Central

    Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.

    2014-01-01

    Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
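    The "segmentation quality over time" underlying the scores above is typically tracked with an overlap measure such as the Dice coefficient, computed after each editing step against a reference segmentation. A small sketch (segmentations are represented here as sets of voxel coordinates; the shapes are illustrative):

```python
# Dice overlap between a segmentation and a reference: 1.0 for a
# perfect match, 0.0 for no overlap.
def dice(seg, ref):
    if not seg and not ref:
        return 1.0
    return 2.0 * len(seg & ref) / (len(seg) + len(ref))

# A 10x10 reference region, an automatic result that undershoots it,
# and the result after one manual editing step.
reference = {(x, y, 0) for x in range(10) for y in range(10)}
initial   = {(x, y, 0) for x in range(8) for y in range(10)}
edited    = {(x, y, 0) for x in range(9) for y in range(10)}

# Quality over time: one Dice value per editing step.
quality_over_time = [dice(initial, reference), dice(edited, reference)]
```

    Summarizing such a curve, for example by how quickly it approaches 1.0, gives an efficiency measure that can be combined with the subjective ratings the paper proposes.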